sha (stringlengths 40–40) | text (stringlengths 1–13.4M) | id (stringlengths 2–117) | tags (listlengths 1–7.91k) | created_at (stringlengths 25–25) | metadata (stringlengths 2–875k) | last_modified (stringlengths 25–25) | arxiv (listlengths 0–25) | languages (listlengths 0–7.91k) | tags_str (stringlengths 17–159k) | text_str (stringlengths 1–447k) | text_lists (listlengths 0–352) | processed_texts (listlengths 1–353) | tokens_length (listlengths 1–353) | input_texts (listlengths 1–40)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
47cb94354b42902ad4913b659640d2b204223b62
|
# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-40K
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/Marcoroni-7B-LaMini-40K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K",
"harness_gsm8k_5",
split="train")
```
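The aggregated metrics described above live in the "results" configuration. As a minimal sketch (assuming the `datasets` library and network access; the config and split names come from this card's metadata), you could enumerate the configurations and pull the latest aggregated results like this:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K"

# List the per-task configurations plus the aggregated "results" config.
configs = get_dataset_config_names(repo)
print(len(configs))

# The "latest" split of the "results" config always tracks the newest run.
results = load_dataset(repo, "results", split="latest")
print(results[0])
```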
## Latest results
These are the [latest results from run 2023-12-03T19:34:57.307967](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K/blob/main/results_2023-12-03T19-34-57.307967.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
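If you prefer the raw JSON file linked above to the parquet-backed "results" config, here is a hedged sketch using `huggingface_hub` (the repo id and filename are taken from the link above; the file's top-level layout is worth inspecting before indexing into it):
```python
import json

from huggingface_hub import hf_hub_download

# Download the raw results file linked above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K",
    filename="results_2023-12-03T19-34-57.307967.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys before drilling into per-task metrics.
print(list(results))
```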
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K
|
[
"region:us"
] |
2023-09-18T13:23:12+00:00
|
{"pretty_name": "Evaluation run of marcchew/Marcoroni-7B-LaMini-40K", "dataset_summary": "Dataset automatically created during the evaluation run of model [marcchew/Marcoroni-7B-LaMini-40K](https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T19:34:57.307967](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__Marcoroni-7B-LaMini-40K/blob/main/results_2023-12-03T19-34-57.307967.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/marcchew/Marcoroni-7B-LaMini-40K", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T16_12_33.522830", "path": ["**/details_harness|drop|3_2023-10-25T16-12-33.522830.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T16-12-33.522830.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T16_12_33.522830", "path": ["**/details_harness|gsm8k|5_2023-10-25T16-12-33.522830.parquet"]}, {"split": "2023_12_03T19_34_57.307967", "path": ["**/details_harness|gsm8k|5_2023-12-03T19-34-57.307967.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T19-34-57.307967.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet", 
"**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet", 
"**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": 
["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": 
["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-22-48.761056.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": 
["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-22-48.761056.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-22-48.761056.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T16_12_33.522830", "path": ["**/details_harness|winogrande|5_2023-10-25T16-12-33.522830.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T16-12-33.522830.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_22_48.761056", "path": ["results_2023-09-18T14-22-48.761056.parquet"]}, {"split": "2023_10_25T16_12_33.522830", "path": ["results_2023-10-25T16-12-33.522830.parquet"]}, {"split": "2023_12_03T19_34_57.307967", "path": ["results_2023-12-03T19-34-57.307967.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T19-34-57.307967.parquet"]}]}]}
|
2023-12-03T19:35:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-40K
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model marcchew/Marcoroni-7B-LaMini-40K on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-03T19:34:57.307967 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-40K",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model marcchew/Marcoroni-7B-LaMini-40K on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T19:34:57.307967(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-40K",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model marcchew/Marcoroni-7B-LaMini-40K on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T19:34:57.307967(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of marcchew/Marcoroni-7B-LaMini-40K## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model marcchew/Marcoroni-7B-LaMini-40K on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T19:34:57.307967(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
29d1fdc578ec3f2260d90f875493a59059a7b183
|
# Dataset Card for Evaluation run of Lazycuber/L2-7b-Guanaco-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Lazycuber/L2-7b-Guanaco-Uncensored](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored",
"harness_winogrande_5",
split="train")
```
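Because each run is stored as a timestamped split, you can also pin a specific run instead of following "latest". A minimal sketch (the split name below is taken from this card's configuration metadata):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored"

# "latest" always tracks the newest run; a timestamped split pins one run.
latest = load_dataset(repo, "harness_gsm8k_5", split="latest")
pinned = load_dataset(repo, "harness_gsm8k_5", split="2023_10_25T07_21_20.584203")

print(len(latest), len(pinned))
```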
## Latest results
These are the [latest results from run 2023-10-25T07:21:20.584203](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored/blob/main/results_2023-10-25T07-21-20.584203.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902985767,
"f1": 0.056828859060402796,
"f1_stderr": 0.0013179206618636607,
"acc": 0.416677387679193,
"acc_stderr": 0.0097821448230569
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902985767,
"f1": 0.056828859060402796,
"f1_stderr": 0.0013179206618636607
},
"harness|gsm8k|5": {
"acc": 0.07960576194086429,
"acc_stderr": 0.007455924338676278
},
"harness|winogrande|5": {
"acc": 0.7537490134175217,
"acc_stderr": 0.012108365307437523
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored
|
[
"region:us"
] |
2023-09-18T13:25:04+00:00
|
{"pretty_name": "Evaluation run of Lazycuber/L2-7b-Guanaco-Uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [Lazycuber/L2-7b-Guanaco-Uncensored](https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-25T07:21:20.584203](https://huggingface.co/datasets/open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored/blob/main/results_2023-10-25T07-21-20.584203.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902985767,\n \"f1\": 0.056828859060402796,\n \"f1_stderr\": 0.0013179206618636607,\n \"acc\": 0.416677387679193,\n \"acc_stderr\": 0.0097821448230569\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902985767,\n \"f1\": 0.056828859060402796,\n \"f1_stderr\": 0.0013179206618636607\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07960576194086429,\n \"acc_stderr\": 0.007455924338676278\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7537490134175217,\n \"acc_stderr\": 0.012108365307437523\n }\n}\n```", "repo_url": "https://huggingface.co/Lazycuber/L2-7b-Guanaco-Uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_25T07_21_20.584203", "path": ["**/details_harness|drop|3_2023-10-25T07-21-20.584203.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-25T07-21-20.584203.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_25T07_21_20.584203", "path": ["**/details_harness|gsm8k|5_2023-10-25T07-21-20.584203.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-25T07-21-20.584203.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-24-41.596109.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-24-41.596109.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-24-41.596109.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_25T07_21_20.584203", "path": ["**/details_harness|winogrande|5_2023-10-25T07-21-20.584203.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-25T07-21-20.584203.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_24_41.596109", "path": ["results_2023-09-18T14-24-41.596109.parquet"]}, {"split": "2023_10_25T07_21_20.584203", "path": ["results_2023-10-25T07-21-20.584203.parquet"]}, {"split": "latest", "path": ["results_2023-10-25T07-21-20.584203.parquet"]}]}]}
|
2023-10-25T06:21:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Lazycuber/L2-7b-Guanaco-Uncensored
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Lazycuber/L2-7b-Guanaco-Uncensored on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
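A minimal sketch, assuming the dataset follows the leaderboard's usual naming convention (the repo id `open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored` and the `harness_winogrande_5` config are inferred from the configs metadata above, not stated in this card):
```python
from datasets import load_dataset

# Repo id and config name are assumptions inferred from this row's configs
# metadata, following the open-llm-leaderboard naming convention.
data = load_dataset("open-llm-leaderboard/details_Lazycuber__L2-7b-Guanaco-Uncensored",
	"harness_winogrande_5",
	split="train")
```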
## Latest results
These are the latest results from run 2023-10-25T07:21:20.584203 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
a0edf0224186ee86d140da9e4443827a0b9a4bcc
|
# Dataset Card for Evaluation run of health360/Healix-410M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/health360/Healix-410M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [health360/Healix-410M](https://huggingface.co/health360/Healix-410M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_health360__Healix-410M",
"harness_winogrande_5",
split="train")
```
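Each run is also stored as its own timestamp-named split, so a specific run can be loaded directly instead of the latest one; a minimal sketch (the `harness_drop_3` config and the split name are taken from the configs listed in this card's metadata):
```python
from datasets import load_dataset

# Load one specific timestamped run rather than the "latest" alias;
# config and split names come from this card's configs metadata.
data = load_dataset("open-llm-leaderboard/details_health360__Healix-410M",
	"harness_drop_3",
	split="2023_10_28T06_52_49.299650")
```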
## Latest results
These are the [latest results from run 2023-10-28T06:52:49.299650](https://huggingface.co/datasets/open-llm-leaderboard/details_health360__Healix-410M/blob/main/results_2023-10-28T06-52-49.299650.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.010591442953020135,
"em_stderr": 0.0010483469790502306,
"f1": 0.049055159395973116,
"f1_stderr": 0.0015554088384130706,
"acc": 0.27071823204419887,
"acc_stderr": 0.007002073426895943
},
"harness|drop|3": {
"em": 0.010591442953020135,
"em_stderr": 0.0010483469790502306,
"f1": 0.049055159395973116,
"f1_stderr": 0.0015554088384130706
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5414364640883977,
"acc_stderr": 0.014004146853791886
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_health360__Healix-410M
|
[
"region:us"
] |
2023-09-18T13:26:07+00:00
|
{"pretty_name": "Evaluation run of health360/Healix-410M", "dataset_summary": "Dataset automatically created during the evaluation run of model [health360/Healix-410M](https://huggingface.co/health360/Healix-410M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_health360__Healix-410M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T06:52:49.299650](https://huggingface.co/datasets/open-llm-leaderboard/details_health360__Healix-410M/blob/main/results_2023-10-28T06-52-49.299650.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.010591442953020135,\n \"em_stderr\": 0.0010483469790502306,\n \"f1\": 0.049055159395973116,\n \"f1_stderr\": 0.0015554088384130706,\n \"acc\": 0.27071823204419887,\n \"acc_stderr\": 0.007002073426895943\n },\n \"harness|drop|3\": {\n \"em\": 0.010591442953020135,\n \"em_stderr\": 0.0010483469790502306,\n \"f1\": 0.049055159395973116,\n \"f1_stderr\": 0.0015554088384130706\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5414364640883977,\n \"acc_stderr\": 0.014004146853791886\n }\n}\n```", "repo_url": "https://huggingface.co/health360/Healix-410M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T06_49_18.544875", "path": ["**/details_harness|drop|3_2023-10-28T06-49-18.544875.parquet"]}, {"split": "2023_10_28T06_52_49.299650", "path": ["**/details_harness|drop|3_2023-10-28T06-52-49.299650.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T06-52-49.299650.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T06_49_18.544875", "path": ["**/details_harness|gsm8k|5_2023-10-28T06-49-18.544875.parquet"]}, {"split": "2023_10_28T06_52_49.299650", "path": ["**/details_harness|gsm8k|5_2023-10-28T06-52-49.299650.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T06-52-49.299650.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": 
["**/details_harness|hellaswag|10_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet", 
"**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet", 
"**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet", 
"**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-25-49.264800.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-25-49.264800.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-25-49.264800.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T06_49_18.544875", "path": ["**/details_harness|winogrande|5_2023-10-28T06-49-18.544875.parquet"]}, {"split": "2023_10_28T06_52_49.299650", "path": ["**/details_harness|winogrande|5_2023-10-28T06-52-49.299650.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T06-52-49.299650.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_25_49.264800", "path": ["results_2023-09-18T14-25-49.264800.parquet"]}, {"split": "2023_10_28T06_49_18.544875", "path": ["results_2023-10-28T06-49-18.544875.parquet"]}, {"split": "2023_10_28T06_52_49.299650", "path": ["results_2023-10-28T06-52-49.299650.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T06-52-49.299650.parquet"]}]}]}
|
2023-10-28T05:52:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of health360/Healix-410M
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model health360/Healix-410M on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
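A minimal sketch of the missing snippet. The repo id below is an assumption based on the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming; `harness_winogrande_5` and `results` are configs listed in this entry's metadata above:
```python
from datasets import load_dataset

# Per-sample details for one task config; the "train" split always points
# to the latest run. (Repo id inferred from the leaderboard naming pattern.)
data = load_dataset("open-llm-leaderboard/details_health360__Healix-410M",
	"harness_winogrande_5",
	split="train")

# The aggregated "results" config mentioned above loads the same way.
results = load_dataset("open-llm-leaderboard/details_health360__Healix-410M",
	"results",
	split="latest")
```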
## Latest results
These are the latest results from run 2023-10-28T06:52:49.299650 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
ccc0e3b816ce2d85f9b74081eebcd7f8c5d1d5ce
|
# Dataset Card for Evaluation run of TinyPixel/testmodel2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TinyPixel/testmodel2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TinyPixel/testmodel2](https://huggingface.co/TinyPixel/testmodel2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TinyPixel__testmodel2",
"harness_winogrande_5",
split="train")
```
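The aggregated "results" configuration mentioned above loads the same way; a short sketch using the "latest" split listed in this dataset's metadata:
```python
from datasets import load_dataset

# Aggregated metrics across all runs; "latest" points at the newest results file.
results = load_dataset("open-llm-leaderboard/details_TinyPixel__testmodel2",
	"results",
	split="latest")
```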
## Latest results
These are the [latest results from run 2023-10-24T13:54:24.629963](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__testmodel2/blob/main/results_2023-10-24T13-54-24.629963.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189794,
"f1": 0.05664848993288591,
"f1_stderr": 0.001329470291478584,
"acc": 0.4072684276253865,
"acc_stderr": 0.009841754656544565
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.00036305608931189794,
"f1": 0.05664848993288591,
"f1_stderr": 0.001329470291478584
},
"harness|gsm8k|5": {
"acc": 0.07657316148597422,
"acc_stderr": 0.007324564881451568
},
"harness|winogrande|5": {
"acc": 0.7379636937647988,
"acc_stderr": 0.012358944431637561
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TinyPixel__testmodel2
|
[
"region:us"
] |
2023-09-18T13:28:36+00:00
|
{"pretty_name": "Evaluation run of TinyPixel/testmodel2", "dataset_summary": "Dataset automatically created during the evaluation run of model [TinyPixel/testmodel2](https://huggingface.co/TinyPixel/testmodel2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TinyPixel__testmodel2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T13:54:24.629963](https://huggingface.co/datasets/open-llm-leaderboard/details_TinyPixel__testmodel2/blob/main/results_2023-10-24T13-54-24.629963.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931189794,\n \"f1\": 0.05664848993288591,\n \"f1_stderr\": 0.001329470291478584,\n \"acc\": 0.4072684276253865,\n \"acc_stderr\": 0.009841754656544565\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.00036305608931189794,\n \"f1\": 0.05664848993288591,\n \"f1_stderr\": 0.001329470291478584\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07657316148597422,\n \"acc_stderr\": 0.007324564881451568\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7379636937647988,\n \"acc_stderr\": 0.012358944431637561\n }\n}\n```", "repo_url": "https://huggingface.co/TinyPixel/testmodel2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T13_54_24.629963", "path": ["**/details_harness|drop|3_2023-10-24T13-54-24.629963.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T13-54-24.629963.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T13_54_24.629963", "path": ["**/details_harness|gsm8k|5_2023-10-24T13-54-24.629963.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T13-54-24.629963.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-28-17.558290.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-28-17.558290.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T13_54_24.629963", "path": ["**/details_harness|winogrande|5_2023-10-24T13-54-24.629963.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T13-54-24.629963.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_28_17.558290", "path": ["results_2023-09-18T14-28-17.558290.parquet"]}, {"split": "2023_10_24T13_54_24.629963", "path": ["results_2023-10-24T13-54-24.629963.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T13-54-24.629963.parquet"]}]}]}
|
2023-10-24T12:54:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TinyPixel/testmodel2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TinyPixel/testmodel2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
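For reference, the loading snippet from the full card earlier in this entry:
```python
from datasets import load_dataset

# The "train" split always points to the latest run's results.
data = load_dataset("open-llm-leaderboard/details_TinyPixel__testmodel2",
	"harness_winogrande_5",
	split="train")
```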
## Latest results
These are the latest results from run 2023-10-24T13:54:24.629963 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TinyPixel/testmodel2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TinyPixel/testmodel2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T13:54:24.629963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TinyPixel/testmodel2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TinyPixel/testmodel2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T13:54:24.629963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TinyPixel/testmodel2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TinyPixel/testmodel2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T13:54:24.629963(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
37bee58038a43526b9d8b95ac6b119bb00c0d44b
|
# Dataset Card for "AO3_fandom_IO"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
ebony59/AO3_fandom_IO
|
[
"region:us"
] |
2023-09-18T13:30:41+00:00
|
{"dataset_info": {"features": [{"name": "input_text", "dtype": "string"}, {"name": "output_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13860266, "num_examples": 9402}], "download_size": 2550145, "dataset_size": 13860266}}
|
2023-11-08T11:10:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AO3_fandom_IO"
More Information needed
|
[
"# Dataset Card for \"AO3_fandom_IO\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AO3_fandom_IO\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AO3_fandom_IO\"\n\nMore Information needed"
] |
6b2d5700c48d6acc235f24a6ee88152bc98e1cfb
|
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage2](https://huggingface.co/euclaise/falcon_1b_stage2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage2",
"harness_winogrande_5",
split="train")
```
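To discover the other configurations (one per evaluated task, plus the aggregated "results" config), you can list them programmatically. A small sketch using the `datasets` helper:
```python
from datasets import get_dataset_config_names

# List every available configuration of this details dataset
configs = get_dataset_config_names("open-llm-leaderboard/details_euclaise__falcon_1b_stage2")
print(len(configs), configs[:5])
```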
## Latest results
These are the [latest results from run 2023-10-28T17:03:53.294329](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage2/blob/main/results_2023-10-28T17-03-53.294329.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.047180159395973295,
"f1_stderr": 0.0011805397389622562,
"acc": 0.31176006314127863,
"acc_stderr": 0.006808465980333592
},
"harness|drop|3": {
"em": 0.0008389261744966443,
"em_stderr": 0.0002964962989801232,
"f1": 0.047180159395973295,
"f1_stderr": 0.0011805397389622562
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6235201262825573,
"acc_stderr": 0.013616931960667183
}
}
```
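The same aggregated numbers can also be loaded directly from the "results" configuration; a minimal sketch (the "latest" split name is the one defined in this dataset's metadata):
```python
from datasets import load_dataset

# Load the aggregated results; the "latest" split points to the most recent run
results = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage2",
	"results",
	split="latest")
```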
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_euclaise__falcon_1b_stage2
|
[
"region:us"
] |
2023-09-18T13:33:47+00:00
|
{"pretty_name": "Evaluation run of euclaise/falcon_1b_stage2", "dataset_summary": "Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage2](https://huggingface.co/euclaise/falcon_1b_stage2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T17:03:53.294329](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage2/blob/main/results_2023-10-28T17-03-53.294329.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.047180159395973295,\n \"f1_stderr\": 0.0011805397389622562,\n \"acc\": 0.31176006314127863,\n \"acc_stderr\": 0.006808465980333592\n },\n \"harness|drop|3\": {\n \"em\": 0.0008389261744966443,\n \"em_stderr\": 0.0002964962989801232,\n \"f1\": 0.047180159395973295,\n \"f1_stderr\": 0.0011805397389622562\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6235201262825573,\n \"acc_stderr\": 0.013616931960667183\n }\n}\n```", "repo_url": "https://huggingface.co/euclaise/falcon_1b_stage2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|arc:challenge|25_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T08_07_03.547290", "path": ["**/details_harness|drop|3_2023-10-28T08-07-03.547290.parquet"]}, {"split": "2023_10_28T17_03_53.294329", "path": ["**/details_harness|drop|3_2023-10-28T17-03-53.294329.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T17-03-53.294329.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T08_07_03.547290", "path": ["**/details_harness|gsm8k|5_2023-10-28T08-07-03.547290.parquet"]}, {"split": "2023_10_28T17_03_53.294329", "path": ["**/details_harness|gsm8k|5_2023-10-28T17-03-53.294329.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T17-03-53.294329.parquet"]}]}, 
{"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hellaswag|10_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-33-29.155732.parquet", 
"**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-33-29.155732.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet", 
"**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet", 
"**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet", 
"**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-33-29.155732.parquet"]}, 
{"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-33-29.155732.parquet"]}, 
{"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-management|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-33-29.155732.parquet"]}, 
{"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-03T15-01-26.920880.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-10-03T15-01-26.920880.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T08_07_03.547290", "path": ["**/details_harness|winogrande|5_2023-10-28T08-07-03.547290.parquet"]}, {"split": "2023_10_28T17_03_53.294329", "path": ["**/details_harness|winogrande|5_2023-10-28T17-03-53.294329.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T17-03-53.294329.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_33_29.155732", "path": ["results_2023-09-18T14-33-29.155732.parquet"]}, {"split": "2023_10_03T15_01_26.920880", "path": ["results_2023-10-03T15-01-26.920880.parquet"]}, {"split": "2023_10_28T08_07_03.547290", "path": ["results_2023-10-28T08-07-03.547290.parquet"]}, {"split": "2023_10_28T17_03_53.294329", "path": ["results_2023-10-28T17-03-53.294329.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T17-03-53.294329.parquet"]}]}]}
|
2023-10-28T16:04:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model euclaise/falcon_1b_stage2 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
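For example (a minimal sketch: the repository id below is inferred from the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations listed in this dataset's metadata):

```python
from datasets import load_dataset

# The "train" split always points at the latest evaluation run.
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage2",
	"harness_winogrande_5",
	split="train")
```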
## Latest results
These are the latest results from run 2023-10-28T17:03:53.294329 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of euclaise/falcon_1b_stage2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T17:03:53.294329(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of euclaise/falcon_1b_stage2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T17:03:53.294329(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of euclaise/falcon_1b_stage2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 4 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T17:03:53.294329(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
7bf507a59baed07d97d8609d6b567fa68369fdb6
|
# Dataset Card for "Fake News Opensources"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
<!--
- **Paper:** Fake News Opensources
-->
- **Homepage:** [https://github.com/AndyTheFactory/FakeNewsDataset](https://github.com/AndyTheFactory/FakeNewsDataset)
- **Repository:** [https://github.com/AndyTheFactory/FakeNewsDataset](https://github.com/AndyTheFactory/FakeNewsDataset)
- **Point of Contact:** [Andrei Paraschiv](https://github.com/AndyTheFactory)
### Dataset Summary
A consolidated and cleaned-up version of the opensources Fake News dataset.
The Fake News Corpus comprises 8,529,090 individual articles, classified into 12 classes: reliable, unreliable, political, bias, fake, conspiracy,
rumor, clickbait, junk science, satire, hate and unknown. The articles were scraped between the end of 2017 and the beginning of 2018 from various
news websites, totaling 647 distinct sources, collecting articles dating from the years leading up to the 2016 US elections and the year after.
Documents were classified based on their source, using the curated website list provided by opensources.co, which leads to a
highly imbalanced class distribution. Their proposed source classification method was based on six criteria:
- Title and Domain name analysis,
- “About Us” analysis,
- source or study mentioning,
- writing style analysis,
- aesthetic analysis, and
- social media analysis.
After extensive data cleaning and duplicate removal, we retain **5,915,569** records.
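The cleaning pipeline itself is not published in this card. Purely as an illustration, a content-hash pass of the kind typically used for such duplicate removal might look like the sketch below (hypothetical; not necessarily the exact method used here):

```python
import hashlib

def deduplicate(records):
    """Yield only the first occurrence of each normalized article body."""
    seen = set()
    for rec in records:
        # Collapse whitespace and lowercase so trivially reformatted copies collide.
        body = " ".join(rec["content"].split()).lower()
        key = hashlib.sha1(body.encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            yield rec
```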
### Languages
English
## Dataset Structure
### Data Instances
An example record looks as follows.
```
{
'id': 4059480,
'type': 'political',
'domain': 'dailycaller.com',
'scraped_at': '2017-11-27',
'url': 'http://dailycaller.com/buzz/massachusettsunited-states/page/2/',
'authors': 'Jeff Winkler, Jonathan Strong, Ken Blackwell, Pat Mcmahon, Julia Mcclatchy, Admin, Matt Purple',
'title': 'The Daily Caller',
'content': 'New Hampshire is the state with the highest median income in the nation, according to the U.S. Census Bureau’s report on income, poverty and health insurance',
}
```
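The corpus can be loaded with the `datasets` library; a minimal sketch, assuming the Hugging Face repository id `andyP/fake_news_en_opensources` and a single `train` split:

```python
from datasets import load_dataset

# Streaming avoids downloading all ~5.9M records up front.
ds = load_dataset("andyP/fake_news_en_opensources", split="train", streaming=True)

for record in ds.take(3):
    print(record["type"], record["domain"], record["title"])
```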
### Data Fields
- `id`: the unique article ID
- `type`: the label of the record (one of: reliable, unreliable, political, bias, fake, conspiracy,
rumor, clickbait, junk science, satire, hate and unknown)
- `domain`: the domain of the source website
- `scraped_at`: date of the original scrape run
- `url`: original article URL
- `authors`: comma-separated list of scraped authors
- `title`: original scraped article title
- `content`: full article text
### Data Splits
Label | Nr Records
:---| :---:
reliable | 1807323
political | 968205
bias | 769874
fake | 762178
conspiracy | 494184
rumor | 375963
unknown | 230532
clickbait | 174176
unreliable | 104537
satire | 84735
junksci | 79099
hate | 64763
**total** | 5915569
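Because the distribution above is heavily skewed (reliable outnumbers hate by roughly 28 to 1), classifiers trained on the raw corpus usually need re-weighting or re-sampling. A sketch of inverse-frequency class weights computed directly from the counts in the table:

```python
# Counts copied from the Data Splits table above.
counts = {
    "reliable": 1807323, "political": 968205, "bias": 769874, "fake": 762178,
    "conspiracy": 494184, "rumor": 375963, "unknown": 230532, "clickbait": 174176,
    "unreliable": 104537, "satire": 84735, "junksci": 79099, "hate": 64763,
}
total = sum(counts.values())  # 5915569
# "balanced" heuristic: weight(label) = total / (n_classes * count(label))
weights = {label: total / (len(counts) * n) for label, n in counts.items()}
print(round(weights["hate"] / weights["reliable"]))  # ~28
```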
## Dataset Creation
### Source Data
News Articles from various sites
#### Who are the source language producers?
News Articles, Blogs
### Annotations
#### Who are the annotators?
Journalists
### Other Known Limitations
The dataset was not manually filtered; therefore, some of the labels might not be correct and some of the URLs might not point to the actual articles but to other pages on the website. However, because the corpus is intended for use in training machine learning algorithms, those problems should not pose a practical issue.
Additionally, once the dataset is finalised (for now, only about 80% has been cleaned and published), I do not intend to update it; it may therefore quickly become outdated for purposes other than content-based algorithms. However, any contributions are welcome!
### Licensing Information
This data is available and distributed under the Apache-2.0 license.
### Citation Information
```
tbd
```
|
andyP/fake_news_en_opensources
|
[
"task_categories:text-classification",
"task_ids:topic-classification",
"task_ids:fact-checking",
"annotations_creators:expert-generated",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:1M<n<10M",
"source_datasets:Opensources https://github.com/BigMcLargeHuge/opensources",
"source_datasets:FakeNews Corpus https://github.com/several27/FakeNewsCorpus",
"language:en",
"license:apache-2.0",
"fake-news-detection",
"fake news",
"english",
"nlp",
"region:us"
] |
2023-09-18T13:35:44+00:00
|
{"annotations_creators": ["expert-generated"], "language_creators": ["found"], "language": ["en"], "license": "apache-2.0", "multilinguality": ["monolingual"], "size_categories": ["1M<n<10M"], "source_datasets": ["Opensources https://github.com/BigMcLargeHuge/opensources", "FakeNews Corpus https://github.com/several27/FakeNewsCorpus"], "task_categories": ["text-classification"], "task_ids": ["topic-classification", "fact-checking"], "pretty_name": "Fake News Opensources", "tags": ["fake-news-detection", "fake news", "english", "nlp"], "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "type", "dtype": "string"}, {"name": "domain", "dtype": "string"}, {"name": "scraped_at", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "authors", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "content", "dtype": "string"}]}}
|
2024-02-12T21:04:30+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-text-classification #task_ids-topic-classification #task_ids-fact-checking #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1M<n<10M #source_datasets-Opensources https-//github.com/BigMcLargeHuge/opensources #source_datasets-FakeNews Corpus https-//github.com/several27/FakeNewsCorpus #language-English #license-apache-2.0 #fake-news-detection #fake news #english #nlp #region-us
|
Dataset Card for "Fake News Opensources"
========================================
Table of Contents
-----------------
* Dataset Description
+ Dataset Summary
+ Supported Tasks and Leaderboards
+ Languages
* Dataset Structure
+ Data Instances
+ Data Fields
+ Data Splits
* Dataset Creation
+ Curation Rationale
+ Source Data
+ Annotations
+ Personal and Sensitive Information
* Considerations for Using the Data
+ Social Impact of Dataset
+ Discussion of Biases
+ Other Known Limitations
* Additional Information
+ Dataset Curators
+ Licensing Information
+ Citation Information
+ Contributions
Dataset Description
-------------------
* Homepage: URL
* Repository: URL
* Point of Contact: Andrei Paraschiv
### Dataset Summary
A consolidated and cleaned-up version of the opensources Fake News dataset.
The Fake News Corpus comprises 8,529,090 individual articles, classified into 12 classes: reliable, unreliable, political, bias, fake, conspiracy,
rumor, clickbait, junk science, satire, hate and unknown. The articles were scraped between the end of 2017 and the beginning of 2018 from various
news websites, totaling 647 distinct sources, collecting articles dating from the years leading up to the 2016 US elections and the year after.
Documents were classified based on their source, using the curated website list provided by URL, which leads to a
highly imbalanced class distribution. Their proposed source classification method was based on six criteria:
* Title and Domain name analysis,
* “About Us” analysis,
* source or study mentioning,
* writing style analysis,
* aesthetic analysis, and
* social media analysis.
After extensive data cleaning and duplicate removal, we retain 5,915,569 records.
### Languages
English
Dataset Structure
-----------------
### Data Instances
An example record looks as follows.
### Data Fields
* 'id': the unique article ID
* 'type': the label of the record (one of: reliable, unreliable, political, bias, fake, conspiracy,
rumor, clickbait, junk science, satire, hate and unknown)
* 'domain': the domain of the source website
* 'scraped\_at': date of the original scrape run
* 'url': original article URL
* 'authors': comma-separated list of scraped authors
* 'title': original scraped article title
* 'content': full article text
### Data Splits
Dataset Creation
----------------
### Source Data
News Articles from various sites
#### Who are the source language producers?
News Articles, Blogs
### Annotations
#### Who are the annotators?
Journalists
### Other Known Limitations
The dataset was not manually filtered; therefore, some of the labels might not be correct and some of the URLs might not point to the actual articles but to other pages on the website. However, because the corpus is intended for use in training machine learning algorithms, those problems should not pose a practical issue.
Additionally, once the dataset is finalised (for now, only about 80% has been cleaned and published), I do not intend to update it; it may therefore quickly become outdated for purposes other than content-based algorithms. However, any contributions are welcome!
### Licensing Information
This data is available and distributed under the Apache-2.0 license.
|
[
"### Dataset Summary\n\n\na consolidated and cleaned up version of the opensources Fake News dataset\n\n\nFake News Corpus comprises 8,529,090 individual articles, classified into 12 classes: reliable, unreliable, political, bias, fake, conspiracy,\nrumor clickbait, junk science, satire, hate and unknown. The articles were scraped between the end of 2017 and the beginning of 2018 from various\nnews websites, totaling 647 distinct sources, collecting articles dating from various years leading to the 2016 US elections and the year after.\nDocuments were classified based on their source, based on the curated website list provided by URL using a leading to a\nhigh imbalanced class distribution. Their proposed source classification method, was based on six criteria:\n\n\n* Title and Domain name analysis,\n* “About Us” analysis,\n* source or study mentioning,\n* writing style analysis,\n* aesthetic analysis and social media analysis.\n\n\nAfter extensive data cleaning and duplicate removal we retain 5,915,569 records",
"### Languages\n\n\nEnglish\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nAn example record looks as follows.",
"### Data Fields\n\n\n* 'id': The unique article ID\n* 'type': the label of the record (one of: reliable, unreliable, political, bias, fake, conspiracy,\nrumor clickbait, junk science, satire, hate)\n* 'scraped\\_at': date of the original scrape run\n* 'url': original article url\n* 'authors': comma separated list of scraped authors\n* 'title': original scraped article title\n* 'content': full article text",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Source Data\n\n\nNews Articles from various sites",
"#### Who are the source language producers?\n\n\nNews Articles, Blogs",
"### Annotations",
"#### Who are the annotators?\n\n\nJournalists",
"### Other Known Limitations\n\n\nThe dataset was not manually filtered, therefore some of the labels might not be correct and some of the URLs might not point to the actual articles but other pages on the website. However, because the corpus is intended for use in training machine learning algorithms, those problems should not pose a practical issue.\n\n\nAdditionally, when the dataset will be finalised (as for now only about 80% was cleaned and published), I do not intend to update it, therefore it might quickly become outdated for other purposes than content-based algorithms. However, any contributions are welcome!",
"### Licensing Information\n\n\nThis data is available and distributed under Apache-2.0 license"
] |
[
"TAGS\n#task_categories-text-classification #task_ids-topic-classification #task_ids-fact-checking #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1M<n<10M #source_datasets-Opensources https-//github.com/BigMcLargeHuge/opensources #source_datasets-FakeNews Corpus https-//github.com/several27/FakeNewsCorpus #language-English #license-apache-2.0 #fake-news-detection #fake news #english #nlp #region-us \n",
"### Dataset Summary\n\n\na consolidated and cleaned up version of the opensources Fake News dataset\n\n\nFake News Corpus comprises 8,529,090 individual articles, classified into 12 classes: reliable, unreliable, political, bias, fake, conspiracy,\nrumor clickbait, junk science, satire, hate and unknown. The articles were scraped between the end of 2017 and the beginning of 2018 from various\nnews websites, totaling 647 distinct sources, collecting articles dating from various years leading to the 2016 US elections and the year after.\nDocuments were classified based on their source, based on the curated website list provided by URL using a leading to a\nhigh imbalanced class distribution. Their proposed source classification method, was based on six criteria:\n\n\n* Title and Domain name analysis,\n* “About Us” analysis,\n* source or study mentioning,\n* writing style analysis,\n* aesthetic analysis and social media analysis.\n\n\nAfter extensive data cleaning and duplicate removal we retain 5,915,569 records",
"### Languages\n\n\nEnglish\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nAn example record looks as follows.",
"### Data Fields\n\n\n* 'id': The unique article ID\n* 'type': the label of the record (one of: reliable, unreliable, political, bias, fake, conspiracy,\nrumor clickbait, junk science, satire, hate)\n* 'scraped\\_at': date of the original scrape run\n* 'url': original article url\n* 'authors': comma separated list of scraped authors\n* 'title': original scraped article title\n* 'content': full article text",
"### Data Splits\n\n\n\nDataset Creation\n----------------",
"### Source Data\n\n\nNews Articles from various sites",
"#### Who are the source language producers?\n\n\nNews Articles, Blogs",
"### Annotations",
"#### Who are the annotators?\n\n\nJournalists",
"### Other Known Limitations\n\n\nThe dataset was not manually filtered, therefore some of the labels might not be correct and some of the URLs might not point to the actual articles but other pages on the website. However, because the corpus is intended for use in training machine learning algorithms, those problems should not pose a practical issue.\n\n\nAdditionally, when the dataset will be finalised (as for now only about 80% was cleaned and published), I do not intend to update it, therefore it might quickly become outdated for other purposes than content-based algorithms. However, any contributions are welcome!",
"### Licensing Information\n\n\nThis data is available and distributed under Apache-2.0 license"
] |
[
165,
228,
12,
14,
118,
11,
10,
16,
5,
11,
133,
19
] |
[
"passage: TAGS\n#task_categories-text-classification #task_ids-topic-classification #task_ids-fact-checking #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1M<n<10M #source_datasets-Opensources https-//github.com/BigMcLargeHuge/opensources #source_datasets-FakeNews Corpus https-//github.com/several27/FakeNewsCorpus #language-English #license-apache-2.0 #fake-news-detection #fake news #english #nlp #region-us \n### Dataset Summary\n\n\na consolidated and cleaned up version of the opensources Fake News dataset\n\n\nFake News Corpus comprises 8,529,090 individual articles, classified into 12 classes: reliable, unreliable, political, bias, fake, conspiracy,\nrumor clickbait, junk science, satire, hate and unknown. The articles were scraped between the end of 2017 and the beginning of 2018 from various\nnews websites, totaling 647 distinct sources, collecting articles dating from various years leading to the 2016 US elections and the year after.\nDocuments were classified based on their source, based on the curated website list provided by URL using a leading to a\nhigh imbalanced class distribution. Their proposed source classification method, was based on six criteria:\n\n\n* Title and Domain name analysis,\n* “About Us” analysis,\n* source or study mentioning,\n* writing style analysis,\n* aesthetic analysis and social media analysis.\n\n\nAfter extensive data cleaning and duplicate removal we retain 5,915,569 records### Languages\n\n\nEnglish\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nAn example record looks as follows."
] |
206decfd473166fe76609bc7931dbe4c9af153a4
|
# Dataset Card for Evaluation run of Rallio67/3B-redpajama-conditional-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Rallio67/3B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T09:28:50.380809](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha/blob/main/results_2023-10-28T09-28-50.380809.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511201,
"f1": 0.052849203020134294,
"f1_stderr": 0.00137566520570553,
"acc": 0.3069000037698072,
"acc_stderr": 0.007930535381730674
},
"harness|drop|3": {
"em": 0.0036703020134228187,
"em_stderr": 0.0006192871806511201,
"f1": 0.052849203020134294,
"f1_stderr": 0.00137566520570553
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.002138670301460461
},
"harness|winogrande|5": {
"acc": 0.6077348066298343,
"acc_stderr": 0.013722400462000888
}
}
```
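The aggregated scores above can also be fetched programmatically through the `results` configuration (a sketch; the `results` config and its `latest` split are listed in this dataset's metadata):

```python
from datasets import load_dataset

# One row per evaluation run; "latest" points at the 2023-10-28 run quoted above.
results = load_dataset(
    "open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha",
    "results",
    split="latest",
)
print(results[0])
```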
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha
|
[
"region:us"
] |
2023-09-18T13:37:15+00:00
|
{"pretty_name": "Evaluation run of Rallio67/3B-redpajama-conditional-alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [Rallio67/3B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T09:28:50.380809](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha/blob/main/results_2023-10-28T09-28-50.380809.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511201,\n \"f1\": 0.052849203020134294,\n \"f1_stderr\": 0.00137566520570553,\n \"acc\": 0.3069000037698072,\n \"acc_stderr\": 0.007930535381730674\n },\n \"harness|drop|3\": {\n \"em\": 0.0036703020134228187,\n \"em_stderr\": 0.0006192871806511201,\n \"f1\": 0.052849203020134294,\n \"f1_stderr\": 0.00137566520570553\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \"acc_stderr\": 0.002138670301460461\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6077348066298343,\n \"acc_stderr\": 0.013722400462000888\n }\n}\n```", "repo_url": "https://huggingface.co/Rallio67/3B-redpajama-conditional-alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T09_28_50.380809", "path": ["**/details_harness|drop|3_2023-10-28T09-28-50.380809.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T09-28-50.380809.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T09_28_50.380809", "path": ["**/details_harness|gsm8k|5_2023-10-28T09-28-50.380809.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T09-28-50.380809.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-36-57.601576.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-36-57.601576.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T09_28_50.380809", "path": ["**/details_harness|winogrande|5_2023-10-28T09-28-50.380809.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T09-28-50.380809.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_36_57.601576", "path": ["results_2023-09-18T14-36-57.601576.parquet"]}, {"split": "2023_10_28T09_28_50.380809", "path": ["results_2023-10-28T09-28-50.380809.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T09-28-50.380809.parquet"]}]}]}
|
2023-10-28T08:29:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Rallio67/3B-redpajama-conditional-alpha
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Rallio67/3B-redpajama-conditional-alpha on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
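For example (a minimal sketch, assuming the standard leaderboard naming convention `open-llm-leaderboard/details_<org>__<model>` and the `harness_winogrande_5` config listed in this card's metadata):
```python
from datasets import load_dataset

# Load the winogrande details split for this model's evaluation run.
data = load_dataset("open-llm-leaderboard/details_Rallio67__3B-redpajama-conditional-alpha",
	"harness_winogrande_5",
	split="train")
```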
## Latest results
These are the latest results from run 2023-10-28T09:28:50.380809 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Rallio67/3B-redpajama-conditional-alpha",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rallio67/3B-redpajama-conditional-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T09:28:50.380809(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Rallio67/3B-redpajama-conditional-alpha",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rallio67/3B-redpajama-conditional-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T09:28:50.380809(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Rallio67/3B-redpajama-conditional-alpha## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rallio67/3B-redpajama-conditional-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T09:28:50.380809(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
ffa0fc7bbd8e852a308388ad943c950eb5c52bda
|
# Dataset Card for Evaluation run of FelixChao/CodeLlama13B-Finetune-v1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [FelixChao/CodeLlama13B-Finetune-v1](https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T05:29:46.702579](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1/blob/main/results_2023-10-24T05-29-46.702579.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298562,
"f1": 0.055318791946309044,
"f1_stderr": 0.0012876698767673924,
"acc": 0.3896146598826094,
"acc_stderr": 0.010919315737434055
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.0003921042190298562,
"f1": 0.055318791946309044,
"f1_stderr": 0.0012876698767673924
},
"harness|gsm8k|5": {
"acc": 0.10993176648976498,
"acc_stderr": 0.008616195587865414
},
"harness|winogrande|5": {
"acc": 0.6692975532754538,
"acc_stderr": 0.013222435887002696
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1
|
[
"region:us"
] |
2023-09-18T13:42:39+00:00
|
{"pretty_name": "Evaluation run of FelixChao/CodeLlama13B-Finetune-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [FelixChao/CodeLlama13B-Finetune-v1](https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T05:29:46.702579](https://huggingface.co/datasets/open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1/blob/main/results_2023-10-24T05-29-46.702579.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298562,\n \"f1\": 0.055318791946309044,\n \"f1_stderr\": 0.0012876698767673924,\n \"acc\": 0.3896146598826094,\n \"acc_stderr\": 0.010919315737434055\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.0003921042190298562,\n \"f1\": 0.055318791946309044,\n \"f1_stderr\": 0.0012876698767673924\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10993176648976498,\n \"acc_stderr\": 0.008616195587865414\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6692975532754538,\n \"acc_stderr\": 0.013222435887002696\n }\n}\n```", "repo_url": "https://huggingface.co/FelixChao/CodeLlama13B-Finetune-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T05_29_46.702579", "path": ["**/details_harness|drop|3_2023-10-24T05-29-46.702579.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T05-29-46.702579.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T05_29_46.702579", "path": ["**/details_harness|gsm8k|5_2023-10-24T05-29-46.702579.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T05-29-46.702579.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-42-15.580779.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-42-15.580779.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-42-15.580779.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T05_29_46.702579", "path": ["**/details_harness|winogrande|5_2023-10-24T05-29-46.702579.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T05-29-46.702579.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_42_15.580779", "path": ["results_2023-09-18T14-42-15.580779.parquet"]}, {"split": "2023_10_24T05_29_46.702579", "path": ["results_2023-10-24T05-29-46.702579.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T05-29-46.702579.parquet"]}]}]}
|
2023-10-24T04:29:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of FelixChao/CodeLlama13B-Finetune-v1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model FelixChao/CodeLlama13B-Finetune-v1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
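For instance (this restates the snippet from the full card for this dataset; the repo and config names come from the card's own metadata):
```python
from datasets import load_dataset

# Load the winogrande details split for this model's evaluation run.
data = load_dataset("open-llm-leaderboard/details_FelixChao__CodeLlama13B-Finetune-v1",
	"harness_winogrande_5",
	split="train")
```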
## Latest results
These are the latest results from run 2023-10-24T05:29:46.702579 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
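The aggregated figures from that run, as recorded in this dataset's results block:
```python
{
    "all": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298562,
        "f1": 0.055318791946309044,
        "f1_stderr": 0.0012876698767673924,
        "acc": 0.3896146598826094,
        "acc_stderr": 0.010919315737434055
    },
    "harness|drop|3": {
        "em": 0.0014681208053691276,
        "em_stderr": 0.0003921042190298562,
        "f1": 0.055318791946309044,
        "f1_stderr": 0.0012876698767673924
    },
    "harness|gsm8k|5": {
        "acc": 0.10993176648976498,
        "acc_stderr": 0.008616195587865414
    },
    "harness|winogrande|5": {
        "acc": 0.6692975532754538,
        "acc_stderr": 0.013222435887002696
    }
}
```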
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of FelixChao/CodeLlama13B-Finetune-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/CodeLlama13B-Finetune-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T05:29:46.702579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FelixChao/CodeLlama13B-Finetune-v1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/CodeLlama13B-Finetune-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T05:29:46.702579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FelixChao/CodeLlama13B-Finetune-v1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model FelixChao/CodeLlama13B-Finetune-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T05:29:46.702579(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b7db31514f7b2fd8feb5eac46f27ffcf13fa2ee5
|
# Dataset Card for "qa_wikipedia_sentence_transformer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
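As a minimal usage sketch (an assumption based on the dataset metadata, which lists `anchor`, `negative`, and `positive` string columns and `train`/`validation`/`test` splits):
```python
from datasets import load_dataset

# Load the triplet-format training split: each row pairs an anchor question
# with a positive (relevant) and a negative (irrelevant) passage.
ds = load_dataset("legacy107/qa_wikipedia_sentence_transformer", split="train")
print(ds[0]["anchor"])
print(ds[0]["positive"])
print(ds[0]["negative"])
```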
|
legacy107/qa_wikipedia_sentence_transformer
|
[
"region:us"
] |
2023-09-18T13:44:24+00:00
|
{"dataset_info": {"features": [{"name": "anchor", "dtype": "string"}, {"name": "negative", "dtype": "string"}, {"name": "positive", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 31856811, "num_examples": 29965}, {"name": "validation", "num_bytes": 3167027, "num_examples": 3000}, {"name": "test", "num_bytes": 3103240, "num_examples": 2981}], "download_size": 2854716, "dataset_size": 38127078}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}]}
|
2023-09-23T01:32:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "qa_wikipedia_sentence_transformer"
More Information needed
|
[
"# Dataset Card for \"qa_wikipedia_sentence_transformer\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"qa_wikipedia_sentence_transformer\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"qa_wikipedia_sentence_transformer\"\n\nMore Information needed"
] |
3cf218bcc1abfd13680bedb40115cb7b173ac61c
|
# Dataset Card for Evaluation run of Rallio67/7B-redpajama-conditional-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Rallio67/7B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha",
"harness_winogrande_5",
split="train")
```
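The card above notes that the "results" configuration aggregates every run and that the "latest" split tracks the most recent one. As a minimal sketch (config and split names taken from the configs declared in this card's metadata), the aggregated metrics can be pulled the same way:

```python
from datasets import load_dataset

# Load the aggregated metrics rather than the per-task details.
# "results" and the "latest" split come from the configs declared
# in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha",
    "results",
    split="latest",
)
print(results[0])  # one row per run, holding the aggregated metrics
```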
## Latest results
These are the [latest results from run 2023-10-26T09:32:05.345460](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha/blob/main/results_2023-10-26T09-32-05.345460.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194129435,
"f1": 0.04856229026845656,
"f1_stderr": 0.0012026937489831246,
"acc": 0.33962342618029373,
"acc_stderr": 0.0077937904808975484
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194129435,
"f1": 0.04856229026845656,
"f1_stderr": 0.0012026937489831246
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772123
},
"harness|winogrande|5": {
"acc": 0.6716653512233622,
"acc_stderr": 0.013198299449717885
}
}
```
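The metric keys above appear to follow the harness naming scheme `harness|<task>|<n_shots>`; a small illustrative snippet (the variable names are mine, not part of the eval harness):

```python
# Illustrative only: split a metric key into task name and few-shot count.
key = "harness|winogrande|5"
_, task, n_shots = key.split("|")
print(task, int(n_shots))  # -> winogrande 5
```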
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha
|
[
"region:us"
] |
2023-09-18T13:45:43+00:00
|
{"pretty_name": "Evaluation run of Rallio67/7B-redpajama-conditional-alpha", "dataset_summary": "Dataset automatically created during the evaluation run of model [Rallio67/7B-redpajama-conditional-alpha](https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-26T09:32:05.345460](https://huggingface.co/datasets/open-llm-leaderboard/details_Rallio67__7B-redpajama-conditional-alpha/blob/main/results_2023-10-26T09-32-05.345460.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194129435,\n \"f1\": 0.04856229026845656,\n \"f1_stderr\": 0.0012026937489831246,\n \"acc\": 0.33962342618029373,\n \"acc_stderr\": 0.0077937904808975484\n },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194129435,\n \"f1\": 0.04856229026845656,\n \"f1_stderr\": 0.0012026937489831246\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 0.0023892815120772123\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6716653512233622,\n \"acc_stderr\": 0.013198299449717885\n }\n}\n```", "repo_url": "https://huggingface.co/Rallio67/7B-redpajama-conditional-alpha", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_26T09_32_05.345460", "path": ["**/details_harness|drop|3_2023-10-26T09-32-05.345460.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-26T09-32-05.345460.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_26T09_32_05.345460", "path": ["**/details_harness|gsm8k|5_2023-10-26T09-32-05.345460.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-26T09-32-05.345460.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-45-25.410527.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-45-25.410527.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_26T09_32_05.345460", "path": ["**/details_harness|winogrande|5_2023-10-26T09-32-05.345460.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-26T09-32-05.345460.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_45_25.410527", "path": ["results_2023-09-18T14-45-25.410527.parquet"]}, {"split": "2023_10_26T09_32_05.345460", "path": ["results_2023-10-26T09-32-05.345460.parquet"]}, {"split": "latest", "path": ["results_2023-10-26T09-32-05.345460.parquet"]}]}]}
|
2023-10-26T08:32:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Rallio67/7B-redpajama-conditional-alpha
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Rallio67/7B-redpajama-conditional-alpha on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-26T09:32:05.345460 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Rallio67/7B-redpajama-conditional-alpha",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rallio67/7B-redpajama-conditional-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-26T09:32:05.345460(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Rallio67/7B-redpajama-conditional-alpha",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rallio67/7B-redpajama-conditional-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-26T09:32:05.345460(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Rallio67/7B-redpajama-conditional-alpha## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Rallio67/7B-redpajama-conditional-alpha on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-26T09:32:05.345460(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
0eb00314de90e04adb610e9826baa074fab7b6bb
|
# Dataset Card for Evaluation run of KnutJaegersberg/deacon-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/deacon-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/deacon-3b](https://huggingface.co/KnutJaegersberg/deacon-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-3b",
"harness_winogrande_5",
split="train")
```
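Since the card mentions 64 per-task configurations, it can help to enumerate them before choosing one to load. A minimal sketch, assuming Hub access (`get_dataset_config_names` is part of the `datasets` library):

```python
from datasets import get_dataset_config_names

# List the available harness configurations for this repository.
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_KnutJaegersberg__deacon-3b"
)
print(len(configs), configs[:5])
```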
## Latest results
These are the [latest results from run 2023-10-28T13:23:04.115502](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-3b/blob/main/results_2023-10-28T13-23-04.115502.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413187,
"f1": 0.05062919463087265,
"f1_stderr": 0.0012970020903289405,
"acc": 0.32509979517380905,
"acc_stderr": 0.007564621001375068
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.0003144653119413187,
"f1": 0.05062919463087265,
"f1_stderr": 0.0012970020903289405
},
"harness|gsm8k|5": {
"acc": 0.0037907505686125853,
"acc_stderr": 0.0016927007401501847
},
"harness|winogrande|5": {
"acc": 0.6464088397790055,
"acc_stderr": 0.013436541262599952
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_KnutJaegersberg__deacon-3b
|
[
"region:us"
] |
2023-09-18T13:48:05+00:00
|
{"pretty_name": "Evaluation run of KnutJaegersberg/deacon-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/deacon-3b](https://huggingface.co/KnutJaegersberg/deacon-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__deacon-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-28T13:23:04.115502](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__deacon-3b/blob/main/results_2023-10-28T13-23-04.115502.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413187,\n \"f1\": 0.05062919463087265,\n \"f1_stderr\": 0.0012970020903289405,\n \"acc\": 0.32509979517380905,\n \"acc_stderr\": 0.007564621001375068\n },\n \"harness|drop|3\": {\n \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.0003144653119413187,\n \"f1\": 0.05062919463087265,\n \"f1_stderr\": 0.0012970020903289405\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0037907505686125853,\n \"acc_stderr\": 0.0016927007401501847\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6464088397790055,\n \"acc_stderr\": 0.013436541262599952\n }\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/deacon-3b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_28T13_23_04.115502", "path": ["**/details_harness|drop|3_2023-10-28T13-23-04.115502.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T13-23-04.115502.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_28T13_23_04.115502", "path": ["**/details_harness|gsm8k|5_2023-10-28T13-23-04.115502.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-28T13-23-04.115502.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-47-42.541004.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T14-47-42.541004.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_28T13_23_04.115502", "path": ["**/details_harness|winogrande|5_2023-10-28T13-23-04.115502.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T13-23-04.115502.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T14_47_42.541004", "path": ["results_2023-09-18T14-47-42.541004.parquet"]}, {"split": "2023_10_28T13_23_04.115502", "path": ["results_2023-10-28T13-23-04.115502.parquet"]}, {"split": "latest", "path": ["results_2023-10-28T13-23-04.115502.parquet"]}]}]}
|
2023-10-28T12:23:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/deacon-3b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model KnutJaegersberg/deacon-3b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
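The code snippet itself was stripped from this copy of the card, so here is a minimal sketch of the usual loading call. The config name `harness_winogrande_5` appears in this row's metadata above, while the repo id `open-llm-leaderboard/details_KnutJaegersberg__deacon-3b` is an assumption inferred from the leaderboard's `details_<org>__<model>` naming convention:
```python
from datasets import load_dataset

# Repo id assumed from the leaderboard's naming convention (this stripped card
# replaces all links with "URL", so it is not stated here explicitly)
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__deacon-3b",
	"harness_winogrande_5",  # config name taken from this row's metadata
	split="train")
```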
## Latest results
These are the latest results from run 2023-10-28T13:23:04.115502 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of KnutJaegersberg/deacon-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/deacon-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T13:23:04.115502 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/deacon-3b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/deacon-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-28T13:23:04.115502 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/deacon-3b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/deacon-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-28T13:23:04.115502 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b82d933b316872be9fd051f9c46783b84303103a
|
# Dataset Card for "indian_food_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
utkarshhh17/indian_food_images
|
[
"region:us"
] |
2023-09-18T14:00:27+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "burger", "1": "butter_naan", "2": "chai", "3": "chapati", "4": "chole_bhature", "5": "dal_makhani", "6": "dhokla", "7": "fried_rice", "8": "idli", "9": "jalebi", "10": "kaathi_rolls", "11": "kadai_paneer", "12": "kulfi", "13": "masala_dosa", "14": "momos", "15": "paani_puri", "16": "pakode", "17": "pav_bhaji", "18": "pizza", "19": "samosa"}}}}], "splits": [{"name": "train", "num_bytes": 1697830157.4234333, "num_examples": 5328}, {"name": "test", "num_bytes": 249679569.3925666, "num_examples": 941}], "download_size": 1601513193, "dataset_size": 1947509726.816}}
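A minimal sketch of reading this configuration, using only what the `dataset_info` above states (the repo id comes from this row, and `label` is a `class_label` feature with 20 dish names):
```python
from datasets import load_dataset

# Repo id taken from this row; the default config defines "train" and "test" splits
ds = load_dataset("utkarshhh17/indian_food_images", split="train")

# "label" is a class_label feature, so each integer maps back to a dish name
example = ds[0]
print(ds.features["label"].int2str(example["label"]))
```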
|
2023-09-19T10:49:04+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "indian_food_images"
More Information needed
|
[
"# Dataset Card for \"indian_food_images\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"indian_food_images\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"indian_food_images\"\n\nMore Information needed"
] |
9c5131b0830f49dd42b76f7438a18309156f7069
|
# Dataset Card for "sg32"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
aimona/sg32
|
[
"region:us"
] |
2023-09-18T14:01:51+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 4073025.0, "num_examples": 32}], "download_size": 0, "dataset_size": 4073025.0}}
|
2023-09-19T14:25:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "sg32"
More Information needed
|
[
"# Dataset Card for \"sg32\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"sg32\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"sg32\"\n\nMore Information needed"
] |
f0a1d10ac8e1b92cd7ac08f5316c350749417681
|
# Dataset Card for "climate-world-region"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
vitaliy-sharandin/climate-world-region
|
[
"region:us"
] |
2023-09-18T14:10:54+00:00
|
{"dataset_info": {"features": [{"name": "Entity", "dtype": "string"}, {"name": "Seasonal variation", "dtype": "float64"}, {"name": "Combined measurements", "dtype": "float64"}, {"name": "Monthly averaged", "dtype": "float64"}, {"name": "Annual averaged", "dtype": "float64"}, {"name": "monthly_sea_surface_temperature_anomaly", "dtype": "float64"}, {"name": "Sea surface temp (lower-bound)", "dtype": "float64"}, {"name": "Sea surface temp (upper-bound)", "dtype": "float64"}, {"name": "Monthly pH measurement", "dtype": "float64"}, {"name": "Annual average", "dtype": "float64"}, {"name": "Temperature anomaly", "dtype": "float64"}, {"name": "Church & White", "dtype": "float64"}, {"name": "University of Hawaii", "dtype": "float64"}, {"name": "Average", "dtype": "float64"}, {"name": "arctic_sea_ice_osisaf", "dtype": "float64"}, {"name": "Monthly averaged.1", "dtype": "float64"}, {"name": "Annual averaged.1", "dtype": "float64"}, {"name": "Monthly averaged.2", "dtype": "float64"}, {"name": "Annual averaged.2", "dtype": "float64"}, {"name": "Date", "dtype": "timestamp[ns, tz=UTC]"}, {"name": "dt", "dtype": "timestamp[ns, tz=UTC]"}], "splits": [{"name": "train", "num_bytes": 1813733, "num_examples": 10198}], "download_size": 450942, "dataset_size": 1813733}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
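A minimal sketch of working with this configuration, assuming only the `dataset_info` above (repo id from this row; flat scalar features plus two timestamp columns in a single `train` split):
```python
from datasets import load_dataset

# Repo id taken from this row; the single "train" split holds 10,198 examples
ds = load_dataset("vitaliy-sharandin/climate-world-region", split="train")

# The features are flat scalars plus timestamps, so a pandas DataFrame is a
# natural view for time-series work
df = ds.to_pandas()
print(df[["Entity", "Date", "Temperature anomaly"]].head())
```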
|
2023-09-20T15:05:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "climate-world-region"
More Information needed
|
[
"# Dataset Card for \"climate-world-region\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"climate-world-region\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"climate-world-region\"\n\nMore Information needed"
] |
d855e2b52c59ed405433d8b31a6b9df35f4d1175
|
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/euclaise/falcon_1b_stage1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage1](https://huggingface.co/euclaise/falcon_1b_stage1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-24T08:48:09.211472](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage1/blob/main/results_2023-10-24T08-48-09.211472.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606464,
"f1": 0.05891988255033575,
"f1_stderr": 0.0012949851468038905,
"acc": 0.3074191002367798,
"acc_stderr": 0.006838410643760707
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606464,
"f1": 0.05891988255033575,
"f1_stderr": 0.0012949851468038905
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.6148382004735596,
"acc_stderr": 0.013676821287521413
}
}
```
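Per the summary above, the aggregated numbers live in the "results" configuration; a minimal sketch of loading them (the "latest" split name follows the convention visible in the config metadata of these detail datasets):
```python
from datasets import load_dataset

# "results" stores the aggregated metrics; "latest" points at the newest run
results = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage1",
	"results",
	split="latest")
```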
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_euclaise__falcon_1b_stage1
|
[
"region:us"
] |
2023-09-18T14:14:38+00:00
|
{"pretty_name": "Evaluation run of euclaise/falcon_1b_stage1", "dataset_summary": "Dataset automatically created during the evaluation run of model [euclaise/falcon_1b_stage1](https://huggingface.co/euclaise/falcon_1b_stage1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_euclaise__falcon_1b_stage1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T08:48:09.211472](https://huggingface.co/datasets/open-llm-leaderboard/details_euclaise__falcon_1b_stage1/blob/main/results_2023-10-24T08-48-09.211472.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606464,\n \"f1\": 0.05891988255033575,\n \"f1_stderr\": 0.0012949851468038905,\n \"acc\": 0.3074191002367798,\n \"acc_stderr\": 0.006838410643760707\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606464,\n \"f1\": 0.05891988255033575,\n \"f1_stderr\": 0.0012949851468038905\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6148382004735596,\n \"acc_stderr\": 0.013676821287521413\n }\n}\n```", "repo_url": "https://huggingface.co/euclaise/falcon_1b_stage1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|arc:challenge|25_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_24T08_48_09.211472", "path": ["**/details_harness|drop|3_2023-10-24T08-48-09.211472.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T08-48-09.211472.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_24T08_48_09.211472", "path": ["**/details_harness|gsm8k|5_2023-10-24T08-48-09.211472.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T08-48-09.211472.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hellaswag|10_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T15-14-21.518286.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T15-14-21.518286.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_24T08_48_09.211472", "path": ["**/details_harness|winogrande|5_2023-10-24T08-48-09.211472.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T08-48-09.211472.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T15_14_21.518286", "path": ["results_2023-09-18T15-14-21.518286.parquet"]}, {"split": "2023_10_24T08_48_09.211472", "path": ["results_2023-10-24T08-48-09.211472.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T08-48-09.211472.parquet"]}]}]}
|
2023-10-24T07:48:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of euclaise/falcon_1b_stage1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model euclaise/falcon_1b_stage1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
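Following the convention used for these evaluation datasets, a minimal sketch might look like this (the repo id is inferred from the model name and the config name is taken from the file list in this record's metadata; both are assumptions, so adjust them to the configuration you need):

```python
from datasets import load_dataset

# Inferred repo id pattern: details_<org>__<model>; "harness_winogrande_5"
# is one of the configurations listed in this record's metadata.
data = load_dataset("open-llm-leaderboard/details_euclaise__falcon_1b_stage1",
                    "harness_winogrande_5",
                    split="train")
```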
## Latest results
These are the latest results from run 2023-10-24T08:48:09.211472 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of euclaise/falcon_1b_stage1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T08:48:09.211472(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of euclaise/falcon_1b_stage1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T08:48:09.211472(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of euclaise/falcon_1b_stage1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model euclaise/falcon_1b_stage1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T08:48:09.211472(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
091f149c5e9ac460ad756bfac381809f9fd38f82
|
# Dataset Card for "pollution-by-region"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
vitaliy-sharandin/pollution-by-region
|
[
"region:us"
] |
2023-09-18T14:17:42+00:00
|
{"dataset_info": {"features": [{"name": "Entity", "dtype": "string"}, {"name": "Code", "dtype": "string"}, {"name": "Annual CO\u2082 emissions", "dtype": "float64"}, {"name": "Year", "dtype": "timestamp[ns, tz=UTC]"}, {"name": "dt", "dtype": "timestamp[ns, tz=UTC]"}], "splits": [{"name": "train", "num_bytes": 1409806, "num_examples": 31349}], "download_size": 395907, "dataset_size": 1409806}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-20T15:05:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pollution-by-region"
More Information needed
|
[
"# Dataset Card for \"pollution-by-region\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pollution-by-region\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pollution-by-region\"\n\nMore Information needed"
] |
0b18605d92bd11ecaabbb289d6b3564ae2c45c60
|
# Dataset Card for "pollution-absolute-variation-co2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
vitaliy-sharandin/pollution-absolute-variation-co2
|
[
"region:us"
] |
2023-09-18T14:19:57+00:00
|
{"dataset_info": {"features": [{"name": "Entity", "dtype": "string"}, {"name": "Code", "dtype": "string"}, {"name": "Annual CO\u2082 emissions growth (abs)", "dtype": "float64"}, {"name": "Year", "dtype": "timestamp[ns, tz=UTC]"}, {"name": "dt", "dtype": "timestamp[ns, tz=UTC]"}], "splits": [{"name": "train", "num_bytes": 1295730, "num_examples": 28944}], "download_size": 350866, "dataset_size": 1295730}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-20T15:05:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pollution-absolute-variation-co2"
More Information needed
|
[
"# Dataset Card for \"pollution-absolute-variation-co2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pollution-absolute-variation-co2\"\n\nMore Information needed"
] |
[
6,
23
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pollution-absolute-variation-co2\"\n\nMore Information needed"
] |
6411980d6c92042e8b88e50ce97010bc11bc5fa0
|
# Dataset Card for "b2489367"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/b2489367
|
[
"region:us"
] |
2023-09-18T14:20:17+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 254, "num_examples": 10}], "download_size": 1431, "dataset_size": 254}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T14:20:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "b2489367"
More Information needed
|
[
"# Dataset Card for \"b2489367\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"b2489367\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"b2489367\"\n\nMore Information needed"
] |
58d7cf527b896a35d38211988c996c690bc9830d
|
# EMoTES-3K
EMoTES-3K is the dataset featured in a research article by [Catapang and Visperas (2023)](https://aclanthology.org/2023.nlp4dh-1.1/) about moral reasoning. This English-Filipino parallel corpus contains moral judgments and explanations of various day-to-day scenarios.
BibTeX entry and citation info
```bibtex
@inproceedings{catapang-visperas-2023-emotion,
title = "Emotion-based Morality in {T}agalog and {E}nglish Scenarios ({EM}o{TES}-3{K}): A Parallel Corpus for Explaining (Im)morality of Actions",
author = "Catapang, Jasper Kyle and
Visperas, Moses",
booktitle = "Proceedings of the Joint 3rd International Conference on Natural Language Processing for Digital Humanities and 8th International Workshop on Computational Linguistics for Uralic Languages",
month = dec,
year = "2023",
address = "Tokyo, Japan",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.nlp4dh-1.1",
pages = "1--6",
}
```
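As a quick start, a minimal loading sketch (the file layout is not documented here, so `load_dataset` is left to auto-detect the default configuration):

```python
from datasets import load_dataset

# Repo id taken from this record; the configuration is auto-detected.
emotes = load_dataset("NLPinas/EMoTES-3K")
print(emotes)
```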
|
NLPinas/EMoTES-3K
|
[
"license:apache-2.0",
"region:us"
] |
2023-09-18T14:34:18+00:00
|
{"license": "apache-2.0"}
|
2024-02-07T15:31:29+00:00
|
[] |
[] |
TAGS
#license-apache-2.0 #region-us
|
# EMoTES-3K
EMoTES-3K is the dataset featured in a research article by Catapang and Visperas (2023) about moral reasoning. This English-Filipino parallel corpus contains moral judgments and explanations of various day-to-day scenarios.
BibTeX entry and citation info
|
[
"# EMoTES-3K\n\nEMoTES-3K is the dataset featured in a research article by Catapang and Visperas (2023) about moral reasoning. This English-Filipino parallel corpus contains moral judgments and explanations of various day-to-day scenarios.\n\nBibTeX entry and citation info"
] |
[
"TAGS\n#license-apache-2.0 #region-us \n",
"# EMoTES-3K\n\nEMoTES-3K is the dataset featured in a research article by Catapang and Visperas (2023) about moral reasoning. This English-Filipino parallel corpus contains moral judgments and explanations of various day-to-day scenarios.\n\nBibTeX entry and citation info"
] |
[
14,
70
] |
[
"passage: TAGS\n#license-apache-2.0 #region-us \n# EMoTES-3K\n\nEMoTES-3K is the dataset featured in a research article by Catapang and Visperas (2023) about moral reasoning. This English-Filipino parallel corpus contains moral judgments and explanations of various day-to-day scenarios.\n\nBibTeX entry and citation info"
] |
4d517821970f28dac6c5be78629368caf02d3d82
|
# Dataset Card for "story_1_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Falah/story_1_prompts
|
[
"region:us"
] |
2023-09-18T14:36:17+00:00
|
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3199, "num_examples": 10}], "download_size": 4429, "dataset_size": 3199}}
|
2023-09-23T09:18:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "story_1_prompts"
More Information needed
|
[
"# Dataset Card for \"story_1_prompts\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"story_1_prompts\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"story_1_prompts\"\n\nMore Information needed"
] |
529091b77379696297dc4e2c1b700c9413dd7920
|
# Dataset Card for "story_2_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Falah/story_2_prompts
|
[
"region:us"
] |
2023-09-18T14:36:21+00:00
|
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2575, "num_examples": 3}], "download_size": 8655, "dataset_size": 2575}}
|
2023-09-23T09:18:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "story_2_prompts"
More Information needed
|
[
"# Dataset Card for \"story_2_prompts\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"story_2_prompts\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"story_2_prompts\"\n\nMore Information needed"
] |
be53b72ab06124ea2e1dd4a2758737a3b6927a55
|
# Dataset Card for "llm_training_marketing"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Dugald/llm_training_marketing
|
[
"region:us"
] |
2023-09-18T14:50:03+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20038, "num_examples": 10}], "download_size": 26247, "dataset_size": 20038}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T14:50:04+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llm_training_marketing"
More Information needed
|
[
"# Dataset Card for \"llm_training_marketing\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llm_training_marketing\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llm_training_marketing\"\n\nMore Information needed"
] |
42ea3ca7960ac979f1f2a49d8d62d0d5190a289b
|
# Dataset Card for "opus100-en-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
manu/opus100-en-fr
|
[
"region:us"
] |
2023-09-18T15:15:15+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 479723, "num_examples": 2000}, {"name": "train", "num_bytes": 206440450, "num_examples": 1000000}, {"name": "validation", "num_bytes": 491476, "num_examples": 2000}], "download_size": 148902270, "dataset_size": 207411649}}
|
2023-09-18T15:15:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "opus100-en-fr"
More Information needed
|
[
"# Dataset Card for \"opus100-en-fr\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"opus100-en-fr\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"opus100-en-fr\"\n\nMore Information needed"
] |
7feb21cfd22d2185459975d8b0065f4af670de8f
|
# Dataset Card for "temp-perplexities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
usvsnsp/temp-perplexities
|
[
"region:us"
] |
2023-09-18T15:24:54+00:00
|
{"dataset_info": {"features": [{"name": "index", "dtype": "int32"}, {"name": "loss", "dtype": "float32"}, {"name": "prompt_perplexity", "dtype": "float32"}, {"name": "generation_perplexity", "dtype": "float32"}, {"name": "sequence_perplexity", "dtype": "float32"}], "splits": [{"name": "pile.duped.6.9b", "num_bytes": 100000000, "num_examples": 5000000}, {"name": "memories.duped.6.9b", "num_bytes": 42419520, "num_examples": 2120976}, {"name": "memories.duped.12b", "num_bytes": 47646560, "num_examples": 2382328}, {"name": "memories.deduped.12b", "num_bytes": 37424320, "num_examples": 1871216}], "download_size": 255603395, "dataset_size": 227490400}}
|
2023-09-18T15:25:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "temp-perplexities"
More Information needed
|
[
"# Dataset Card for \"temp-perplexities\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"temp-perplexities\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"temp-perplexities\"\n\nMore Information needed"
] |
d7e9029ab951f8e8f6408084375eab7acc7159e9
|
# Dataset Card for `backsum`
## Licensing
This dataset was derived from the [Scisumm Corpus](https://github.com/WING-NUS/scisumm-corpus).
If you use this data, please cite the original CL-SciSumm overview paper:
```
@inproceedings{chandrasekaran-etal-2019-overview,
    title = {Overview and Results: CL-SciSumm Shared Task 2019},
    author = {Chandrasekaran, Muthu Kumar and Yasunaga, Michihiro and Radev, Dragomir and Freitag, Dayne and Kan, Min-Yen},
    year = 2019,
    booktitle = {Proceedings of the Joint Workshop on Bibliometric-enhanced Information Retrieval and NLP for Digital Libraries (BIRNDL 2019)}
}
```
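For reference, loading the corpus from the Hub could look like the following minimal sketch; it assumes the default configuration with the train/test JSONL splits declared in the repo metadata:

```python
from datasets import load_dataset

# The repo config maps "train" -> train.jsonl and "test" -> test.jsonl,
# so the default configuration should expose both splits.
backsum = load_dataset("lguenth/backsum")
print(backsum["train"][0])
```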
|
lguenth/backsum
|
[
"language:en",
"license:cc-by-4.0",
"region:us"
] |
2023-09-18T15:27:29+00:00
|
{"language": ["en"], "license": "cc-by-4.0", "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "train.jsonl"}, {"split": "test", "path": "test.jsonl"}]}]}
|
2023-09-26T13:20:27+00:00
|
[] |
[
"en"
] |
TAGS
#language-English #license-cc-by-4.0 #region-us
|
# Dataset Card for 'backsum'
## Licensing
This dataset was derived from the Scisumm Corpus.
If you use this data, please cite the original CL-SciSumm overview paper:
|
[
"# Dataset Card for 'backsum'",
"## Licensing\n\nThis dataset was derived from the Scisumm Corpus.\n\nIf you use this data, please cite the original CL-SciSumm overview paper:"
] |
[
"TAGS\n#language-English #license-cc-by-4.0 #region-us \n",
"# Dataset Card for 'backsum'",
"## Licensing\n\nThis dataset was derived from the Scisumm Corpus.\n\nIf you use this data, please cite the original CL-SciSumm overview paper:"
] |
[
19,
9,
37
] |
[
"passage: TAGS\n#language-English #license-cc-by-4.0 #region-us \n# Dataset Card for 'backsum'## Licensing\n\nThis dataset was derived from the Scisumm Corpus.\n\nIf you use this data, please cite the original CL-SciSumm overview paper:"
] |
2a694473eaff1e5de8027d4863612332404822fc
|
# Dataset Card for "vhac_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
nguyenthanhdo/vhac_v2
|
[
"region:us"
] |
2023-09-18T15:35:23+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 346229589, "num_examples": 108658}], "download_size": 163968580, "dataset_size": 346229589}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T15:35:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "vhac_v2"
More Information needed
|
[
"# Dataset Card for \"vhac_v2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"vhac_v2\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"vhac_v2\"\n\nMore Information needed"
] |
108e5a3e7c1a392406acf36984bff75420c87dcb
|
# Dataset Card for "europarl-en-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
manu/europarl-en-fr
|
[
"region:us"
] |
2023-09-18T15:41:12+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "int64"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 685175635, "num_examples": 2051014}], "download_size": 413609385, "dataset_size": 685175635}}
|
2023-09-18T15:41:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "europarl-en-fr"
More Information needed
|
[
"# Dataset Card for \"europarl-en-fr\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"europarl-en-fr\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"europarl-en-fr\"\n\nMore Information needed"
] |
9a93e53017350a0bfc16493f67b12dc91db99fac
|
# Dataset Card for "vhac_v2_chai_format"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
nguyenthanhdo/vhac_v2_chai_format
|
[
"region:us"
] |
2023-09-18T15:41:52+00:00
|
{"dataset_info": {"features": [{"name": "model_input", "dtype": "string"}, {"name": "model_output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 369591059.0, "num_examples": 108658}], "download_size": 177238172, "dataset_size": 369591059.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T15:42:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "vhac_v2_chai_format"
More Information needed
|
[
"# Dataset Card for \"vhac_v2_chai_format\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"vhac_v2_chai_format\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"vhac_v2_chai_format\"\n\nMore Information needed"
] |
04f3766f754940170fd2e640b49f1966719a713e
|
# Dataset Card for "9bc865b4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/9bc865b4
|
[
"region:us"
] |
2023-09-18T16:00:58+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 188, "num_examples": 10}], "download_size": 1354, "dataset_size": 188}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T16:00:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "9bc865b4"
More Information needed
|
[
"# Dataset Card for \"9bc865b4\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"9bc865b4\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"9bc865b4\"\n\nMore Information needed"
] |
3d73fa76b042531984068744a8d8ff9453d463ac
|
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
Goldyhghoul/LouisBeaters
|
[
"region:us"
] |
2023-09-18T16:01:35+00:00
|
{}
|
2023-09-18T16:16:29+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Description
- Homepage:
- Repository:
- Paper:
- Leaderboard:
- Point of Contact:
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
8,
24,
32,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9d19bd4b27fd56cbae55b6d718d99d4b72d4832c
|
# Dataset Card for "4e6d4d01"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/4e6d4d01
|
[
"region:us"
] |
2023-09-18T16:02:52+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 176, "num_examples": 10}], "download_size": 1328, "dataset_size": 176}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T16:02:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "4e6d4d01"
More Information needed
|
[
"# Dataset Card for \"4e6d4d01\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"4e6d4d01\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"4e6d4d01\"\n\nMore Information needed"
] |
655afb1df12e8eca97f790f3525b02418d549958
|
# Dataset Card for "b6ea8c05"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/b6ea8c05
|
[
"region:us"
] |
2023-09-18T16:02:55+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 176, "num_examples": 10}], "download_size": 1328, "dataset_size": 176}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T16:02:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "b6ea8c05"
More Information needed
|
[
"# Dataset Card for \"b6ea8c05\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"b6ea8c05\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"b6ea8c05\"\n\nMore Information needed"
] |
df63b0e2f182fa80b3f84aa1cbda1644e3a20b5a
|
# Dataset Card for "vhac_v2_chai_format_80k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
nguyenthanhdo/vhac_v2_chai_format_80k
|
[
"region:us"
] |
2023-09-18T16:05:37+00:00
|
{"dataset_info": {"features": [{"name": "model_input", "dtype": "string"}, {"name": "model_output", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 272113279.4640063, "num_examples": 80000}], "download_size": 130456890, "dataset_size": 272113279.4640063}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T16:05:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "vhac_v2_chai_format_80k"
More Information needed
|
[
"# Dataset Card for \"vhac_v2_chai_format_80k\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"vhac_v2_chai_format_80k\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"vhac_v2_chai_format_80k\"\n\nMore Information needed"
] |
96ed83c9db16fb6ba6deca76d5d409e0a486a972
|
# Dataset of Hitamu Kyan
This is the dataset of Hitamu Kyan, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 296 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 719 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 296 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 296 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 296 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 296 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 296 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 719 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 719 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 719 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/hitamu_kyan_futokunoguild
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-18T16:25:08+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-18T16:29:41+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of Hitamu Kyan
======================
This is the dataset of Hitamu Kyan, containing 296 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
6bdbea2367e4578e50b2040b5457cffc0abd0588
|
# Dataset Card for "whisper_largev2_test_results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
linhqyy/whisper_largev2_test_results
|
[
"region:us"
] |
2023-09-18T16:30:58+00:00
|
{"dataset_info": {"features": [{"name": "predictions", "dtype": "string"}, {"name": "references", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 70565, "num_examples": 748}], "download_size": 36811, "dataset_size": 70565}}
|
2023-09-18T16:35:37+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "whisper_largev2_test_results"
More Information needed
|
[
"# Dataset Card for \"whisper_largev2_test_results\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"whisper_largev2_test_results\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"whisper_largev2_test_results\"\n\nMore Information needed"
] |
11a4cae4d469f6bb0f3a9d37ee80710b267a3e83
|
# Dataset Card for "3500_more_movie_reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pembelajarff/3500_more_movie_reviews
|
[
"region:us"
] |
2023-09-18T16:31:22+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "review", "dtype": "string"}, {"name": "review_length", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 4631655.01, "num_examples": 3592}, {"name": "validation", "num_bytes": 515774.5, "num_examples": 400}], "download_size": 3424005, "dataset_size": 5147429.51}}
|
2023-09-29T09:01:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "3500_more_movie_reviews"
More Information needed
|
[
"# Dataset Card for \"3500_more_movie_reviews\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"3500_more_movie_reviews\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"3500_more_movie_reviews\"\n\nMore Information needed"
] |
36b1c00241797b52a720661891cce1eebc974eb3
|
# Dataset Card for "Grocery_chatbot_text_classification_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
nelson2424/Grocery_chatbot_text_v1
|
[
"region:us"
] |
2023-09-18T16:38:04+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "category", "dtype": "string"}, {"name": "items", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 622317, "num_examples": 2482}], "download_size": 204878, "dataset_size": 622317}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-25T19:32:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "Grocery_chatbot_text_classification_v1"
More Information needed
|
[
"# Dataset Card for \"Grocery_chatbot_text_classification_v1\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"Grocery_chatbot_text_classification_v1\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"Grocery_chatbot_text_classification_v1\"\n\nMore Information needed"
] |
69851b23e940c094ae4b3c469fbfc5f0b5b5f338
|
# Dataset of Maidena Ange
This is the dataset of Maidena Ange, containing 220 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 220 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 542 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 220 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 220 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 220 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 220 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 220 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 542 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 542 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 542 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
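To grab one of these packages programmatically, here is a minimal sketch (assuming the zip archives listed above sit at the root of this dataset repository) using `huggingface_hub`:

```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives listed in the table above.
# repo_type="dataset" is required because this is a dataset repository,
# not a model repository.
path = hf_hub_download(
    repo_id="CyberHarem/maidena_ange_futokunoguild",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(path)  # local cache path of the downloaded zip
```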
|
CyberHarem/maidena_ange_futokunoguild
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-18T16:44:05+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-18T16:45:45+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of Maidena Ange
=======================
This is the dataset of Maidena Ange, containing 220 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
3eaba7a342f8e784e71b6c900e5699e5f86b2140
|
# Dataset Card for "animal_photorealistic"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Falah/animal_photorealistic
|
[
"region:us"
] |
2023-09-18T16:51:46+00:00
|
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 169255, "num_examples": 1000}], "download_size": 24711, "dataset_size": 169255}}
|
2023-09-18T16:51:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "animal_photorealistic"
More Information needed
|
[
"# Dataset Card for \"animal_photorealistic\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"animal_photorealistic\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"animal_photorealistic\"\n\nMore Information needed"
] |
1f31e0d48c52541367346d4e0b89275ae3418fd9
|
# Dataset Card for "urdu-ner"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
mirfan899/urdu-ner
|
[
"region:us"
] |
2023-09-18T16:57:06+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "tokens", "sequence": "string"}, {"name": "ner_tags", "sequence": {"class_label": {"names": {"0": "TIME", "1": "PERSON", "2": "ORGANIZATION", "3": "O", "4": "NUMBER", "5": "LOCATION", "6": "DESIGNATION", "7": "DATE"}}}}], "splits": [{"name": "train", "num_bytes": 12556540, "num_examples": 18172}, {"name": "validation", "num_bytes": 5412660, "num_examples": 7788}, {"name": "test", "num_bytes": 5412660, "num_examples": 7788}], "download_size": 4173687, "dataset_size": 23381860}}
|
2023-09-18T16:57:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "urdu-ner"
More Information needed
|
[
"# Dataset Card for \"urdu-ner\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"urdu-ner\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"urdu-ner\"\n\nMore Information needed"
] |
5dd1aa09552997721b9535ef365e392ab8df8229
|
# Dataset of Toxico Dannar
This is the dataset of Toxico Dannar, containing 270 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 270 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 613 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 270 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 270 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 270 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 270 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 270 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 613 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 613 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 613 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/toxico_dannar_futokunoguild
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-18T17:02:43+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-18T17:09:10+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of Toxico Dannar
========================
This is the dataset of Toxico Dannar, containing 270 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
72d8280fcc4e88242d5b9d81735073c0ba88b42f
|
# Dataset Card for "processed_demo"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
fundrais123/processed_demo
|
[
"region:us"
] |
2023-09-18T17:14:39+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "package_name", "dtype": "string"}, {"name": "review", "dtype": "string"}, {"name": "date", "dtype": "string"}, {"name": "star", "dtype": "int64"}, {"name": "version_id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 1508, "num_examples": 5}, {"name": "test", "num_bytes": 956, "num_examples": 5}], "download_size": 9451, "dataset_size": 2464}}
|
2023-09-18T17:14:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "processed_demo"
More Information needed
|
[
"# Dataset Card for \"processed_demo\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"processed_demo\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"processed_demo\"\n\nMore Information needed"
] |
bc1cd00ab7c9c9cfbdff0e418a100e411b9ddd2f
|
# Dataset of Hanabata Nohkins
This is the dataset of Hanabata Nohkins, containing 225 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 225 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 523 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 225 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 225 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 225 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 225 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 225 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 523 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 523 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 523 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/hanabata_nohkins_futokunoguild
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-18T17:23:22+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-18T17:27:58+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of Hanabata Nohkins
===========================
This is the dataset of Hanabata Nohkins, containing 225 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
4450a0e5731783d46b00356dad38a9b63bcf8b96
|
This is my collection of prompts to increase my productivity as a co-founder and CEO at Hugging Face.
|
clem/prompts
|
[
"license:apache-2.0",
"region:us"
] |
2023-09-18T17:28:28+00:00
|
{"license": "apache-2.0"}
|
2023-09-22T00:19:50+00:00
|
[] |
[] |
TAGS
#license-apache-2.0 #region-us
|
This is my collection of prompts to increase my productivity as a co-founder and CEO at Hugging Face.
|
[] |
[
"TAGS\n#license-apache-2.0 #region-us \n"
] |
[
14
] |
[
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
d4eb0dd6e4d67a4edfc8eb1ef1db3db9b15145e3
|
# Dataset of Enome
This is the dataset of Enome, containing 146 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 146 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 350 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 146 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 146 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 146 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 146 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 146 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 350 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 350 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 350 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
CyberHarem/enome_futokunoguild
|
[
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] |
2023-09-18T17:37:33+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]}
|
2023-09-18T17:38:54+00:00
|
[] |
[] |
TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
|
Dataset of Enome
================
This is the dataset of Enome, containing 146 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
|
[] |
[
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
431470a8fa282b6f70f22a680480bf716152298c
|
# Dataset Card for "elm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
TinyPixel/elm
|
[
"region:us"
] |
2023-09-18T17:50:39+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2542932, "num_examples": 1073}], "download_size": 1390964, "dataset_size": 2542932}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2024-01-29T14:05:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "elm"
More Information needed
|
[
"# Dataset Card for \"elm\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"elm\"\n\nMore Information needed"
] |
[
6,
11
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"elm\"\n\nMore Information needed"
] |
451e4b7eb50172e0116aa255098bc65f34365f0d
|
# Dataset Card for "srbd1_v2_annotated"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Lancelot53/srbd1_v2_annotated
|
[
"region:us"
] |
2023-09-18T17:54:42+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "xml", "dtype": "string"}, {"name": "html", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "annotated", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 29595348.121978022, "num_examples": 1077}], "download_size": 3598400, "dataset_size": 29595348.121978022}}
|
2023-09-18T18:03:14+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "srbd1_v2_annotated"
More Information needed
|
[
"# Dataset Card for \"srbd1_v2_annotated\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"srbd1_v2_annotated\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"srbd1_v2_annotated\"\n\nMore Information needed"
] |
c8b8c08aabf0e93a86604f9c2ac69db6ff633a22
|
# Dataset Card for "MAD-Main-Test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Ali-C137/MAD-Main-Test
|
[
"region:us"
] |
2023-09-18T18:05:09+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "GenId", "dtype": "int64"}, {"name": "SubId", "dtype": "int64"}, {"name": "DatasetName", "dtype": "string"}, {"name": "DatasetLink", "dtype": "string"}, {"name": "Text", "dtype": "string"}, {"name": "MetaData", "struct": [{"name": "__index_level_0__", "dtype": "int64"}, {"name": "created_date", "dtype": "string"}, {"name": "deleted", "dtype": "bool"}, {"name": "detoxify", "dtype": "null"}, {"name": "emojis", "struct": [{"name": "count", "sequence": "int32"}, {"name": "name", "sequence": "string"}]}, {"name": "id", "dtype": "string"}, {"name": "labels", "struct": [{"name": "count", "sequence": "int32"}, {"name": "name", "sequence": "string"}, {"name": "value", "sequence": "float64"}]}, {"name": "lang", "dtype": "string"}, {"name": "message_id", "dtype": "string"}, {"name": "message_tree_id", "dtype": "string"}, {"name": "model_name", "dtype": "null"}, {"name": "parent_id", "dtype": "string"}, {"name": "rank", "dtype": "float64"}, {"name": "review_count", "dtype": "int32"}, {"name": "review_result", "dtype": "bool"}, {"name": "role", "dtype": "string"}, {"name": "synthetic", "dtype": "bool"}, {"name": "tree_state", "dtype": "string"}, {"name": "user_id", "dtype": "string"}]}, {"name": "ConcatenatedText", "dtype": "int64"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 87616889, "num_examples": 67073}], "download_size": 34138667, "dataset_size": 87616889}}
|
2023-09-18T18:05:12+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "MAD-Main-Test"
More Information needed
|
[
"# Dataset Card for \"MAD-Main-Test\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"MAD-Main-Test\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"MAD-Main-Test\"\n\nMore Information needed"
] |
9d085aba8fd7bfc303fcf9333b0e3ef791a75508
|
# Dataset Card for "srbd1_v2_annotated_segmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Lancelot53/srbd1_v2_annotated_segmented
|
[
"region:us"
] |
2023-09-18T18:14:42+00:00
|
{"dataset_info": {"features": [{"name": "html", "dtype": "string"}, {"name": "response", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1623614, "num_examples": 2434}], "download_size": 525557, "dataset_size": 1623614}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T18:14:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "srbd1_v2_annotated_segmented"
More Information needed
|
[
"# Dataset Card for \"srbd1_v2_annotated_segmented\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"srbd1_v2_annotated_segmented\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"srbd1_v2_annotated_segmented\"\n\nMore Information needed"
] |
1753908022c5cfa9c359d0685d70f66a5a14897b
|
# Dataset Card for YOSM
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** [Iyanuoluwa/YOSM](https://github.com/IyanuSh/YOSM)
- **Paper:** [A new Yorùbá Sentiment Corpus for Nigerian/Nollywood Movie Reviews](https://arxiv.org/pdf/2204.09711.pdf)
- **Point of Contact:** [Iyanuoluwa Shode](mailto:[email protected])
### Dataset Summary
YOSM is the first Yorùbá sentiment corpus for Nollywood movie reviews. The reviews were collected from movie review websites: IMDB, Rotten Tomatoes, LetterboxD, Cinemapointer, and Nollyrated.
### Languages
Yorùbá (ISO 639-1: yo) - the third most spoken indigenous African language with over 50 million speakers.
## Dataset Structure
### Data Instances
An instance consists of a movie review and the corresponding class label.
### Data Fields
- `yo_review`: A movie review in Yorùbá
- `sentiment`: The label describing the sentiment of the movie review.
### Data Splits
The YOSM dataset has 3 splits: _train_, _dev_, and _test_. Below are the statistics for Version 3.0.0 of the dataset.
| Dataset Split | Number of Instances in Split |
| ------------- | ------------------------------------------- |
| Train | 800 |
| Development | 200 |
| Test | 500 |
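A minimal loading sketch (the repository id `asoria/sample-script` and the split names are assumptions based on this card; Hugging Face datasets often name the dev split `validation`, so adjust if needed):

```python
from datasets import load_dataset

# Load the splits described in the table above and inspect one instance
# with the two fields documented in "Data Fields".
yosm = load_dataset("asoria/sample-script")
print(yosm)  # available splits and their sizes

sample = yosm["train"][0]
print(sample["yo_review"], sample["sentiment"])
```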
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
|
asoria/sample-script
|
[
"task_categories:text-classification",
"task_ids:sentiment-analysis",
"annotations_creators:expert-generated",
"language_creators:found",
"multilinguality:monolingual",
"size_categories:1K<n<10K",
"source_datasets:original",
"language:yo",
"license:unknown",
"movie reviews",
"nollywood",
"arxiv:2204.09711",
"region:us"
] |
2023-09-18T18:27:00+00:00
|
{"annotations_creators": ["expert-generated"], "language_creators": ["found"], "language": ["yo"], "license": ["unknown"], "multilinguality": ["monolingual"], "size_categories": ["1K<n<10K"], "source_datasets": ["original"], "task_categories": ["text-classification"], "task_ids": ["sentiment-analysis"], "tags": ["movie reviews", "nollywood"]}
|
2023-09-18T18:27:15+00:00
|
[
"2204.09711"
] |
[
"yo"
] |
TAGS
#task_categories-text-classification #task_ids-sentiment-analysis #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Yoruba #license-unknown #movie reviews #nollywood #arxiv-2204.09711 #region-us
|
Dataset Card for YOSM
=====================
Table of Contents
-----------------
* Dataset Description
+ Dataset Summary
+ Supported Tasks and Leaderboards
+ Languages
* Dataset Structure
+ Data Instances
+ Data Fields
+ Data Splits
* Dataset Creation
+ Curation Rationale
+ Source Data
+ Annotations
+ Personal and Sensitive Information
* Considerations for Using the Data
+ Social Impact of Dataset
+ Discussion of Biases
+ Other Known Limitations
* Additional Information
+ Dataset Curators
+ Licensing Information
+ Citation Information
+ Contributions
Dataset Description
-------------------
* Homepage:
* Repository: Iyanuoluwa/YOSM
* Paper: A new Yorùbá Sentiment Corpus for Nigerian/Nollywood Movie Reviews
* Point of Contact: Iyanuoluwa Shode
### Dataset Summary
YOSM is the first Yorùbá sentiment corpus for Nollywood movie reviews. The reviews were collected from movie review websites: IMDB, Rotten Tomatoes, LetterboxD, Cinemapointer, and Nollyrated.
### Languages
Yorùbá (ISO 639-1: yo) - the third most spoken indigenous African language with over 50 million speakers.
Dataset Structure
-----------------
### Data Instances
An instance consists of a movie review and the corresponding class label.
### Data Fields
* 'yo\_review': A movie review in Yorùbá
* 'sentiment': The label describing the sentiment of the movie review.
### Data Splits
The YOSM dataset has 3 splits: *train*, *dev*, and *test*. Below are the statistics for Version 3.0.0 of the dataset.
Dataset Creation
----------------
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
Considerations for Using the Data
---------------------------------
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
Additional Information
----------------------
### Dataset Curators
### Licensing Information
### Contributions
|
[
"### Dataset Summary\n\n\nYOSM is the first Yorùbá sentiment corpus for Nollywood movie reviews. The reviews were collected from movie reviews websites - IMDB, Rotten Tomatoes, LetterboxD, Cinemapointer, and Nollyrated.",
"### Languages\n\n\nYorùbá (ISO 639-1: yo) - the third most spoken indigenous African language with over 50 million speakers.\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nAn instance consists of a movie review and the corresponding class label.",
"### Data Fields\n\n\n* 'yo\\_review': A movie review in Yorùbá\n* 'sentiment': The label describing the sentiment of the movie review.",
"### Data Splits\n\n\nThe YOSM dataset has 3 splits: *train*, *dev*, and *test*. Below are the statistics for Version 3.0.0 of the dataset.",
"### Data Splits\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#task_categories-text-classification #task_ids-sentiment-analysis #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Yoruba #license-unknown #movie reviews #nollywood #arxiv-2204.09711 #region-us \n",
"### Dataset Summary\n\n\nYOSM is the first Yorùbá sentiment corpus for Nollywood movie reviews. The reviews were collected from movie reviews websites - IMDB, Rotten Tomatoes, LetterboxD, Cinemapointer, and Nollyrated.",
"### Languages\n\n\nYorùbá (ISO 639-1: yo) - the third most spoken indigenous African language with over 50 million speakers.\n\n\nDataset Structure\n-----------------",
"### Data Instances\n\n\nAn instance consists of a movie review and the corresponding class label.",
"### Data Fields\n\n\n* 'yo\\_review': A movie review in Yorùbá\n* 'sentiment': The label describing the sentiment of the movie review.",
"### Data Splits\n\n\nThe YOSM dataset has 3 splits: *train*, *dev*, and *test*. Below are the statistics for Version 3.0.0 of the dataset.",
"### Data Splits\n\n\nDataset Creation\n----------------",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations\n\n\nAdditional Information\n----------------------",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
104,
57,
40,
21,
38,
45,
11,
7,
4,
10,
10,
5,
5,
9,
18,
7,
8,
14,
6,
6,
5
] |
[
"passage: TAGS\n#task_categories-text-classification #task_ids-sentiment-analysis #annotations_creators-expert-generated #language_creators-found #multilinguality-monolingual #size_categories-1K<n<10K #source_datasets-original #language-Yoruba #license-unknown #movie reviews #nollywood #arxiv-2204.09711 #region-us \n### Dataset Summary\n\n\nYOSM is the first Yorùbá sentiment corpus for Nollywood movie reviews. The reviews were collected from movie reviews websites - IMDB, Rotten Tomatoes, LetterboxD, Cinemapointer, and Nollyrated.### Languages\n\n\nYorùbá (ISO 639-1: yo) - the third most spoken indigenous African language with over 50 million speakers.\n\n\nDataset Structure\n-----------------### Data Instances\n\n\nAn instance consists of a movie review and the corresponding class label.### Data Fields\n\n\n* 'yo\\_review': A movie review in Yorùbá\n* 'sentiment': The label describing the sentiment of the movie review.### Data Splits\n\n\nThe YOSM dataset has 3 splits: *train*, *dev*, and *test*. Below are the statistics for Version 3.0.0 of the dataset.### Data Splits\n\n\nDataset Creation\n----------------### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information\n\n\nConsiderations for Using the Data\n---------------------------------### Social Impact of Dataset### Discussion of Biases### Other Known Limitations\n\n\nAdditional Information\n----------------------### Dataset Curators### Licensing Information### Contributions"
] |
aeea5197e8dd264164254fd4af33e815ba5f0288
|
# Dataset Card for "wiki_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
yoandrey/wiki_text
|
[
"region:us"
] |
2023-09-18T18:37:01+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "int32"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 14903122606, "num_examples": 35167920}], "download_size": 9399662602, "dataset_size": 14903122606}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T18:49:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "wiki_text"
More Information needed
|
[
"# Dataset Card for \"wiki_text\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"wiki_text\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"wiki_text\"\n\nMore Information needed"
] |
66bea06c454ce622240edd95ef5829788e154ce2
|
# Dataset Card for "ted-transcriptions-cantonese"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
indiejoseph/ted-transcriptions-cantonese
|
[
"region:us"
] |
2023-09-18T18:49:04+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1569597, "num_examples": 249}], "download_size": 1066997, "dataset_size": 1569597}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T18:49:07+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ted-transcriptions-cantonese"
More Information needed
|
[
"# Dataset Card for \"ted-transcriptions-cantonese\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ted-transcriptions-cantonese\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ted-transcriptions-cantonese\"\n\nMore Information needed"
] |
d522246e305afb746ce676377c79da72ec19d99d
|
# Dataset Card for Evaluation run of migtissera/Synthia-34B-v1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/migtissera/Synthia-34B-v1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [migtissera/Synthia-34B-v1.2](https://huggingface.co/migtissera/Synthia-34B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2",
"harness_truthfulqa_mc_0",
split="train")
```
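To pull the aggregated metrics instead of the per-task details, here is a minimal sketch (the `"results"` configuration is described above; the `"latest"` split name is an assumption based on this card's split-naming convention):

```python
from datasets import load_dataset

# The "results" configuration aggregates the run-level metrics. Per the
# convention described above, "latest" should point at the most recent
# evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2",
    "results",
    split="latest",
)
```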
## Latest results
These are the [latest results from run 2023-09-18T20:05:34.645170](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2/blob/main/results_2023-09-18T20-05-34.645170.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5320903185183409,
"acc_stderr": 0.03517517994960793,
"acc_norm": 0.5358397153796313,
"acc_norm_stderr": 0.03516397638431902,
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.4467341818408572,
"mc2_stderr": 0.014969799807071376
},
"harness|arc:challenge|25": {
"acc": 0.5119453924914675,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5486348122866894,
"acc_norm_stderr": 0.01454210456995527
},
"harness|hellaswag|10": {
"acc": 0.5587532364070902,
"acc_stderr": 0.00495521278783238,
"acc_norm": 0.7432782314280024,
"acc_norm_stderr": 0.004359318206428689
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.506578947368421,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.506578947368421,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5433962264150943,
"acc_stderr": 0.030656748696739428,
"acc_norm": 0.5433962264150943,
"acc_norm_stderr": 0.030656748696739428
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111502,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111502
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.03812400565974834,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.03812400565974834
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.027575960723278243,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.027575960723278243
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031596,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031596
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.032018671228777947,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.032018671228777947
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114982,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114982
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.03983798306659806,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.03983798306659806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6954128440366972,
"acc_stderr": 0.019732299420354052,
"acc_norm": 0.6954128440366972,
"acc_norm_stderr": 0.019732299420354052
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4027777777777778,
"acc_stderr": 0.033448873829978666,
"acc_norm": 0.4027777777777778,
"acc_norm_stderr": 0.033448873829978666
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475544,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475544
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7172995780590717,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.7172995780590717,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801714,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801714
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.0276019213814176,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.0276019213814176
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6743295019157088,
"acc_stderr": 0.016757989458549675,
"acc_norm": 0.6743295019157088,
"acc_norm_stderr": 0.016757989458549675
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5867052023121387,
"acc_stderr": 0.02651126136940925,
"acc_norm": 0.5867052023121387,
"acc_norm_stderr": 0.02651126136940925
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.028580341065138296,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.028580341065138296
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.027667138569422708,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.027667138569422708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.378748370273794,
"acc_stderr": 0.012389052105003732,
"acc_norm": 0.378748370273794,
"acc_norm_stderr": 0.012389052105003732
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.020217030653186457,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.020217030653186457
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.047381987035454834,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.047381987035454834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.031343283582089536,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.031343283582089536
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03615507630310935,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03615507630310935
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2998776009791922,
"mc1_stderr": 0.01604035296671362,
"mc2": 0.4467341818408572,
"mc2_stderr": 0.014969799807071376
}
}
```
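Since the blob above is plain JSON, the headline metrics can be extracted directly; a minimal sketch (where `results_json` is a hypothetical string holding that blob):

```python
import json

# results_json is assumed to hold the JSON blob printed above.
results = json.loads(results_json)

overall = results["all"]
print(f"acc: {overall['acc']:.4f} (stderr {overall['acc_stderr']:.4f})")

# Per-task 5-shot MMLU accuracies:
for task, metrics in results.items():
    if task.startswith("harness|hendrycksTest"):
        print(task, metrics["acc"])
```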
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2
|
[
"region:us"
] |
2023-09-18T19:05:59+00:00
|
{"pretty_name": "Evaluation run of migtissera/Synthia-34B-v1.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [migtissera/Synthia-34B-v1.2](https://huggingface.co/migtissera/Synthia-34B-v1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T20:05:34.645170](https://huggingface.co/datasets/open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2/blob/main/results_2023-09-18T20-05-34.645170.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5320903185183409,\n \"acc_stderr\": 0.03517517994960793,\n \"acc_norm\": 0.5358397153796313,\n \"acc_norm_stderr\": 0.03516397638431902,\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.4467341818408572,\n \"mc2_stderr\": 0.014969799807071376\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597171,\n \"acc_norm\": 0.5486348122866894,\n \"acc_norm_stderr\": 0.01454210456995527\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5587532364070902,\n \"acc_stderr\": 0.00495521278783238,\n \"acc_norm\": 0.7432782314280024,\n \"acc_norm_stderr\": 0.004359318206428689\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.506578947368421,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.506578947368421,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5433962264150943,\n \"acc_stderr\": 0.030656748696739428,\n \"acc_norm\": 0.5433962264150943,\n \"acc_norm_stderr\": 0.030656748696739428\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n \"acc_stderr\": 0.04140685639111502,\n \"acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.04140685639111502\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 
0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n \"acc_stderr\": 0.03812400565974834,\n \"acc_norm\": 0.5028901734104047,\n \"acc_norm_stderr\": 0.03812400565974834\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421296,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421296\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n \"acc_stderr\": 0.027575960723278243,\n \"acc_norm\": 0.6225806451612903,\n \"acc_norm_stderr\": 0.027575960723278243\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031596,\n \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031596\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.032018671228777947,\n \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.032018671228777947\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5128205128205128,\n \"acc_stderr\": 
0.02534267129380725,\n \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.02534267129380725\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114982,\n \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114982\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.39072847682119205,\n \"acc_stderr\": 0.03983798306659806,\n \"acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.03983798306659806\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6954128440366972,\n \"acc_stderr\": 0.019732299420354052,\n \"acc_norm\": 0.6954128440366972,\n \"acc_norm_stderr\": 0.019732299420354052\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4027777777777778,\n \"acc_stderr\": 0.033448873829978666,\n \"acc_norm\": 0.4027777777777778,\n \"acc_norm_stderr\": 0.033448873829978666\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475544,\n \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475544\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7172995780590717,\n \"acc_stderr\": 0.02931281415395593,\n \"acc_norm\": 0.7172995780590717,\n \"acc_norm_stderr\": 0.02931281415395593\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.04616631111801714,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.04616631111801714\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n \"acc_stderr\": 0.0276019213814176,\n \"acc_norm\": 0.7692307692307693,\n \"acc_norm_stderr\": 0.0276019213814176\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6743295019157088,\n \"acc_stderr\": 0.016757989458549675,\n \"acc_norm\": 0.6743295019157088,\n 
\"acc_norm_stderr\": 0.016757989458549675\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5867052023121387,\n \"acc_stderr\": 0.02651126136940925,\n \"acc_norm\": 0.5867052023121387,\n \"acc_norm_stderr\": 0.02651126136940925\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.028580341065138296,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.028580341065138296\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.027667138569422708,\n \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.027667138569422708\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.378748370273794,\n \"acc_stderr\": 0.012389052105003732,\n \"acc_norm\": 0.378748370273794,\n \"acc_norm_stderr\": 0.012389052105003732\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.020217030653186457,\n \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.020217030653186457\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n \"acc_stderr\": 0.047381987035454834,\n \"acc_norm\": 0.5727272727272728,\n \"acc_norm_stderr\": 0.047381987035454834\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n \"acc_stderr\": 0.031343283582089536,\n \"acc_norm\": 0.7313432835820896,\n \"acc_norm_stderr\": 0.031343283582089536\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03615507630310935,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03615507630310935\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2998776009791922,\n \"mc1_stderr\": 0.01604035296671362,\n \"mc2\": 0.4467341818408572,\n \"mc2_stderr\": 0.014969799807071376\n }\n}\n```", "repo_url": "https://huggingface.co/migtissera/Synthia-34B-v1.2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_09_18T20_05_34.645170", "path": ["**/details_harness|arc:challenge|25_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hellaswag|10_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T20-05-34.645170.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T20_05_34.645170", "path": ["results_2023-09-18T20-05-34.645170.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T20-05-34.645170.parquet"]}]}]}
|
2023-09-18T19:07:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of migtissera/Synthia-34B-v1.2
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model migtissera/Synthia-34B-v1.2 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
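A minimal sketch (the repository id and config name below follow the leaderboard's usual naming conventions and the "configs" list in this card's metadata, so treat them as assumptions rather than verified values):
```python
from datasets import load_dataset

# Config names such as "harness_truthfulqa_mc_0" come from this card's
# "configs" metadata; the details repo id is assumed to follow the
# leaderboard's usual "details_<org>__<model>" pattern.
data = load_dataset("open-llm-leaderboard/details_migtissera__Synthia-34B-v1.2",
                    "harness_truthfulqa_mc_0",
                    split="train")
```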
## Latest results
These are the latest results from run 2023-09-18T20:05:34.645170 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of migtissera/Synthia-34B-v1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Synthia-34B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T20:05:34.645170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of migtissera/Synthia-34B-v1.2",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Synthia-34B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T20:05:34.645170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of migtissera/Synthia-34B-v1.2## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model migtissera/Synthia-34B-v1.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T20:05:34.645170(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
74dfc20cf7618b7164f1fc47a3851dd60d7324d0
|
# Dataset Card for "ai-hdlcoder-pretokenized-dataset-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
AWfaw/ai-hdlcoder-pretokenized-dataset-train
|
[
"region:us"
] |
2023-09-18T19:27:38+00:00
|
{"dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}, {"name": "ratio_char_token", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 1052402380, "num_examples": 53550}], "download_size": 337098653, "dataset_size": 1052402380}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-20T17:37:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ai-hdlcoder-pretokenized-dataset-train"
More Information needed
|
[
"# Dataset Card for \"ai-hdlcoder-pretokenized-dataset-train\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ai-hdlcoder-pretokenized-dataset-train\"\n\nMore Information needed"
] |
[
6,
26
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ai-hdlcoder-pretokenized-dataset-train\"\n\nMore Information needed"
] |
55314170978c59305bac476c4f7a9b8abfff5de7
|
# Dataset Card for "ger_micro_benchmark"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
jphme/ger_micro_benchmark
|
[
"region:us"
] |
2023-09-18T19:55:08+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "eval", "path": "data/eval-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "subject", "dtype": "string"}], "splits": [{"name": "eval", "num_bytes": 69430, "num_examples": 200}], "download_size": 39957, "dataset_size": 69430}}
|
2023-09-18T20:15:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ger_micro_benchmark"
More Information needed
|
[
"# Dataset Card for \"ger_micro_benchmark\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ger_micro_benchmark\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ger_micro_benchmark\"\n\nMore Information needed"
] |
990feed75052f5394618125a892aa33434040640
|
# Dataset Card for "RegulatoryReqs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
mariapaulaf/RegulatoryReqs
|
[
"region:us"
] |
2023-09-18T20:06:11+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "input_ids", "sequence": "int32"}], "splits": [{"name": "train", "num_bytes": 151700.0, "num_examples": 37}, {"name": "test", "num_bytes": 20500.0, "num_examples": 5}], "download_size": 71240, "dataset_size": 172200.0}}
|
2023-09-18T20:40:14+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "RegulatoryReqs"
More Information needed
|
[
"# Dataset Card for \"RegulatoryReqs\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"RegulatoryReqs\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"RegulatoryReqs\"\n\nMore Information needed"
] |
113bd56ac2b12f2eb6132c5287e624192b60c76b
|
This is the set of math questions from the May 2023 SAT, taken from here: https://www.mcelroytutoring.com/lower.php?url=44-official-sat-pdfs-and-82-official-act-pdf-practice-tests-free.
Questions that included images were excluded, but all other math questions, including those with tables, were included.
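To experiment with the questions, here is a minimal loading sketch (the available splits are not documented on this card, so none is assumed):
```python
from datasets import load_dataset

# Minimal sketch: load the May 2023 SAT math questions and list the splits.
# The dataset's structure is not documented on this card, so we load
# everything and inspect it rather than assume a split name.
ds = load_dataset("mcaleste/sat_multiple_choice_math_may_23")
print(ds)  # shows the splits and their sizes
```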
|
mcaleste/sat_multiple_choice_math_may_23
|
[
"size_categories:n<1K",
"language:en",
"region:us"
] |
2023-09-18T20:30:36+00:00
|
{"language": ["en"], "size_categories": ["n<1K"]}
|
2023-10-14T01:23:29+00:00
|
[] |
[
"en"
] |
TAGS
#size_categories-n<1K #language-English #region-us
|
This is the set of math questions from the May 2023 SAT, taken from here: URL
Questions that included images were excluded, but all other math questions, including those with tables, were included.
|
[] |
[
"TAGS\n#size_categories-n<1K #language-English #region-us \n"
] |
[
20
] |
[
"passage: TAGS\n#size_categories-n<1K #language-English #region-us \n"
] |
361ba20fde00c2f175855e95637649b5dec9a4b9
|
# Bangumi Image Base of The Demon Girl Next Door
This is the image base of bangumi The Demon Girl Next Door; we detected 18 characters and 3728 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
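For such manual preprocessing, a minimal sketch of fetching and unpacking one character's archive (assuming `huggingface_hub` is installed and that the per-character zips live at the paths shown in the Download column below):
```python
from huggingface_hub import hf_hub_download
import zipfile

# Minimal sketch: download character 0's archive from the dataset repo
# and unpack it locally for inspection and cleaning.
path = hf_hub_download(repo_id="BangumiBase/thedemongirlnextdoor",
                       filename="0/dataset.zip",
                       repo_type="dataset")
with zipfile.ZipFile(path) as zf:
    zf.extractall("character_0")
```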
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1497 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 41 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 43 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 14 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 139 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 149 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 96 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 18 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 8 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 16 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 116 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 364 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 823 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 136 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 46 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 5 | [Download](15/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 16 | 105 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 112 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
BangumiBase/thedemongirlnextdoor
|
[
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] |
2023-09-18T21:13:56+00:00
|
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
|
2023-09-29T08:18:09+00:00
|
[] |
[] |
TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
|
Bangumi Image Base of The Demon Girl Next Door
==============================================
This is the image base of bangumi The Demon Girl Next Door; we detected 18 characters and 3728 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
|
[] |
[
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |
[
25
] |
[
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |
d572da93aa323d552a99973ddb9cd6138d00b6c7
|
# Bangumi Image Base of Jashin-chan Dropkick X
This is the image base of bangumi Jashin-chan Dropkick X; we detected 19 characters and 795 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 80 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 124 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 69 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 15 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 27 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 23 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 40 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 6 | [Download](7/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 8 | 24 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 29 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 33 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 55 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 58 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 39 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 26 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 18 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 19 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 5 | [Download](17/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 105 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
|
BangumiBase/jashinchandropkickx
|
[
"size_categories:n<1K",
"license:mit",
"art",
"region:us"
] |
2023-09-18T21:14:33+00:00
|
{"license": "mit", "size_categories": ["n<1K"], "tags": ["art"]}
|
2023-09-29T08:24:49+00:00
|
[] |
[] |
TAGS
#size_categories-n<1K #license-mit #art #region-us
|
Bangumi Image Base of Jashin-chan Dropkick X
============================================
This is the image base of bangumi Jashin-chan Dropkick X; we detected 19 characters and 795 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may actually contain noise. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potentially noisy samples (approximately 1% probability).
Here is the characters' preview:
|
[] |
[
"TAGS\n#size_categories-n<1K #license-mit #art #region-us \n"
] |
[
23
] |
[
"passage: TAGS\n#size_categories-n<1K #license-mit #art #region-us \n"
] |
224a25873893b0e10eb8855d51e0c9967c50028c
|
# Dataset Card for "8000-java-preprocessed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
DavidMOBrien/8000-java-preprocessed
|
[
"region:us"
] |
2023-09-18T21:57:53+00:00
|
{"dataset_info": {"features": [{"name": "before", "dtype": "string"}, {"name": "after", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 563226571, "num_examples": 343959}, {"name": "test", "num_bytes": 77867200, "num_examples": 48017}, {"name": "valid", "num_bytes": 74511240, "num_examples": 48232}], "download_size": 297216874, "dataset_size": 715605011}}
|
2023-09-18T21:59:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "8000-java-preprocessed"
More Information needed
|
[
"# Dataset Card for \"8000-java-preprocessed\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"8000-java-preprocessed\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"8000-java-preprocessed\"\n\nMore Information needed"
] |
8bb5ab8ca8d02df4d471499e41249d5ce8cf4a0b
|
# Dataset Card for "github-issues"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
bsankar/github-issues
|
[
"region:us"
] |
2023-09-18T22:33:49+00:00
|
{"dataset_info": {"features": [{"name": "url", "dtype": "string"}, {"name": "repository_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "comments_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "user", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "labels", "list": [{"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "name", "dtype": "string"}, {"name": "color", "dtype": "string"}, {"name": "default", "dtype": "bool"}, {"name": "description", "dtype": "string"}]}, {"name": "state", "dtype": "string"}, {"name": "locked", "dtype": "bool"}, {"name": "assignee", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "assignees", "list": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "milestone", "struct": [{"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "labels_url", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "description", "dtype": 
"string"}, {"name": "creator", "struct": [{"name": "login", "dtype": "string"}, {"name": "id", "dtype": "int64"}, {"name": "node_id", "dtype": "string"}, {"name": "avatar_url", "dtype": "string"}, {"name": "gravatar_id", "dtype": "string"}, {"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "followers_url", "dtype": "string"}, {"name": "following_url", "dtype": "string"}, {"name": "gists_url", "dtype": "string"}, {"name": "starred_url", "dtype": "string"}, {"name": "subscriptions_url", "dtype": "string"}, {"name": "organizations_url", "dtype": "string"}, {"name": "repos_url", "dtype": "string"}, {"name": "events_url", "dtype": "string"}, {"name": "received_events_url", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "site_admin", "dtype": "bool"}]}, {"name": "open_issues", "dtype": "int64"}, {"name": "closed_issues", "dtype": "int64"}, {"name": "state", "dtype": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "due_on", "dtype": "null"}, {"name": "closed_at", "dtype": "null"}]}, {"name": "comments", "sequence": "string"}, {"name": "created_at", "dtype": "timestamp[s]"}, {"name": "updated_at", "dtype": "timestamp[s]"}, {"name": "closed_at", "dtype": "timestamp[s]"}, {"name": "author_association", "dtype": "string"}, {"name": "active_lock_reason", "dtype": "null"}, {"name": "draft", "dtype": "bool"}, {"name": "pull_request", "struct": [{"name": "url", "dtype": "string"}, {"name": "html_url", "dtype": "string"}, {"name": "diff_url", "dtype": "string"}, {"name": "patch_url", "dtype": "string"}, {"name": "merged_at", "dtype": "timestamp[s]"}]}, {"name": "body", "dtype": "string"}, {"name": "reactions", "struct": [{"name": "url", "dtype": "string"}, {"name": "total_count", "dtype": "int64"}, {"name": "+1", "dtype": "int64"}, {"name": "-1", "dtype": "int64"}, {"name": "laugh", "dtype": "int64"}, {"name": "hooray", "dtype": "int64"}, {"name": "confused", "dtype": "int64"}, {"name": "heart", "dtype": "int64"}, {"name": "rocket", "dtype": "int64"}, {"name": "eyes", "dtype": "int64"}]}, {"name": "timeline_url", "dtype": "string"}, {"name": "performed_via_github_app", "dtype": "null"}, {"name": "state_reason", "dtype": "string"}, {"name": "is_pull_request", "dtype": "bool"}, {"name": "is_closed", "dtype": "bool"}, {"name": "close_time", "dtype": "duration[us]"}], "splits": [{"name": "train", "num_bytes": 12125043, "num_examples": 1000}], "download_size": 3282501, "dataset_size": 12125043}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T22:33:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "github-issues"
More Information needed
|
[
"# Dataset Card for \"github-issues\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"github-issues\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"github-issues\"\n\nMore Information needed"
] |
47b40e628b9cb97415b253f931950f7fe3a001cf
|
# Dataset Card for "8000-java-preprocessed-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
DavidMOBrien/8000-java-preprocessed-v2
|
[
"region:us"
] |
2023-09-18T22:38:59+00:00
|
{"dataset_info": {"features": [{"name": "before", "dtype": "string"}, {"name": "after", "dtype": "string"}, {"name": "repo", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 556419873, "num_examples": 322448}, {"name": "test", "num_bytes": 76892752, "num_examples": 44883}, {"name": "valid", "num_bytes": 73527268, "num_examples": 45083}], "download_size": 292278962, "dataset_size": 706839893}}
|
2023-09-18T22:40:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "8000-java-preprocessed-v2"
More Information needed
|
[
"# Dataset Card for \"8000-java-preprocessed-v2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"8000-java-preprocessed-v2\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"8000-java-preprocessed-v2\"\n\nMore Information needed"
] |
3c9611415b027e0c8e2215790326bc2ad5f08ef7
|
# Dataset Card for Evaluation run of lloorree/kssht-dahj-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-dahj-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-dahj-70b](https://huggingface.co/lloorree/kssht-dahj-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-dahj-70b",
"harness_truthfulqa_mc_0",
split="train")
```
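The same pattern works for any of the configurations listed in this card's metadata. As a minimal sketch (assuming the `results` configuration and its `latest` split exist exactly as listed in the config table below), the aggregated results can be loaded like this:

```python
from datasets import load_dataset

# "results" holds one row per evaluation run; the "latest" split always
# points at the most recent run (here 2023_09_18T23_50_58.093131).
results = load_dataset("open-llm-leaderboard/details_lloorree__kssht-dahj-70b",
                       "results",
                       split="latest")
print(results[0])  # mirrors the JSON shown in "Latest results" below
```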
## Latest results
These are the [latest results from run 2023-09-18T23:50:58.093131](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-dahj-70b/blob/main/results_2023-09-18T23-50-58.093131.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7033014017574061,
"acc_stderr": 0.03081446175839962,
"acc_norm": 0.7072547203046122,
"acc_norm_stderr": 0.03078306684205309,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5891645864509103,
"mc2_stderr": 0.015115214729699759
},
"harness|arc:challenge|25": {
"acc": 0.6612627986348123,
"acc_stderr": 0.013830568927974332,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403515
},
"harness|hellaswag|10": {
"acc": 0.6867157936666003,
"acc_stderr": 0.0046288092584835265,
"acc_norm": 0.8730332603067118,
"acc_norm_stderr": 0.003322552829608905
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.030643607071677098,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.030643607071677098
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7283018867924528,
"acc_stderr": 0.027377706624670713,
"acc_norm": 0.7283018867924528,
"acc_norm_stderr": 0.027377706624670713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.032166008088022675,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.032166008088022675
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.029896145682095455,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.029896145682095455
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.02845038880528437,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.02845038880528437
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9326424870466321,
"acc_stderr": 0.018088393839078912,
"acc_norm": 0.9326424870466321,
"acc_norm_stderr": 0.018088393839078912
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7282051282051282,
"acc_stderr": 0.02255655101013236,
"acc_norm": 0.7282051282051282,
"acc_norm_stderr": 0.02255655101013236
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7815126050420168,
"acc_stderr": 0.02684151432295894,
"acc_norm": 0.7815126050420168,
"acc_norm_stderr": 0.02684151432295894
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.46357615894039733,
"acc_stderr": 0.04071636065944215,
"acc_norm": 0.46357615894039733,
"acc_norm_stderr": 0.04071636065944215
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.908256880733945,
"acc_stderr": 0.012376323409137103,
"acc_norm": 0.908256880733945,
"acc_norm_stderr": 0.012376323409137103
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.03362277436608043,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.03362277436608043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8987341772151899,
"acc_stderr": 0.019637720526065498,
"acc_norm": 0.8987341772151899,
"acc_norm_stderr": 0.019637720526065498
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7982062780269058,
"acc_stderr": 0.02693611191280227,
"acc_norm": 0.7982062780269058,
"acc_norm_stderr": 0.02693611191280227
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8702290076335878,
"acc_stderr": 0.029473649496907065,
"acc_norm": 0.8702290076335878,
"acc_norm_stderr": 0.029473649496907065
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8735632183908046,
"acc_stderr": 0.011884488905895538,
"acc_norm": 0.8735632183908046,
"acc_norm_stderr": 0.011884488905895538
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6458100558659218,
"acc_stderr": 0.015995644947299225,
"acc_norm": 0.6458100558659218,
"acc_norm_stderr": 0.015995644947299225
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.77491961414791,
"acc_stderr": 0.023720088516179027,
"acc_norm": 0.77491961414791,
"acc_norm_stderr": 0.023720088516179027
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.021185893615225184,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.021185893615225184
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5612777053455019,
"acc_stderr": 0.012673969883493268,
"acc_norm": 0.5612777053455019,
"acc_norm_stderr": 0.012673969883493268
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.02667925227010314,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.02667925227010314
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.01703522925803403,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.01703522925803403
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546195,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546195
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700637,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700637
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276915,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276915
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.017329234580409098,
"mc2": 0.5891645864509103,
"mc2_stderr": 0.015115214729699759
}
}
```
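Because every per-task entry shares the same `acc`/`acc_norm` fields, summary statistics can be recomputed directly from this dictionary. A minimal sketch, assuming `results` is bound to the Python dict shown above:

```python
# Macro-average acc_norm over the MMLU (hendrycksTest) subtasks only.
mmlu = {task: scores for task, scores in results.items()
        if task.startswith("harness|hendrycksTest-")}
macro_acc_norm = sum(s["acc_norm"] for s in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} subtasks, macro acc_norm = {macro_acc_norm:.4f}")
```

Note that the `"all"` entry above aggregates over every task reporting these fields (including ARC and HellaSwag), not only the MMLU subtasks.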
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_lloorree__kssht-dahj-70b
|
[
"region:us"
] |
2023-09-18T22:51:21+00:00
|
{"pretty_name": "Evaluation run of lloorree/kssht-dahj-70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [lloorree/kssht-dahj-70b](https://huggingface.co/lloorree/kssht-dahj-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-dahj-70b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T23:50:58.093131](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-dahj-70b/blob/main/results_2023-09-18T23-50-58.093131.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7033014017574061,\n \"acc_stderr\": 0.03081446175839962,\n \"acc_norm\": 0.7072547203046122,\n \"acc_norm_stderr\": 0.03078306684205309,\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5891645864509103,\n \"mc2_stderr\": 0.015115214729699759\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6612627986348123,\n \"acc_stderr\": 0.013830568927974332,\n \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403515\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6867157936666003,\n \"acc_stderr\": 0.0046288092584835265,\n \"acc_norm\": 0.8730332603067118,\n \"acc_norm_stderr\": 0.003322552829608905\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.030643607071677098,\n \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.030643607071677098\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7283018867924528,\n \"acc_stderr\": 0.027377706624670713,\n \"acc_norm\": 0.7283018867924528,\n \"acc_norm_stderr\": 0.027377706624670713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n \"acc_stderr\": 0.032166008088022675,\n \"acc_norm\": 0.8194444444444444,\n \"acc_norm_stderr\": 0.032166008088022675\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n 
\"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.029896145682095455,\n \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.029896145682095455\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.03996629574876719,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.03996629574876719\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.02845038880528437,\n \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.02845038880528437\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9326424870466321,\n \"acc_stderr\": 0.018088393839078912,\n \"acc_norm\": 0.9326424870466321,\n \"acc_norm_stderr\": 0.018088393839078912\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7282051282051282,\n 
\"acc_stderr\": 0.02255655101013236,\n \"acc_norm\": 0.7282051282051282,\n \"acc_norm_stderr\": 0.02255655101013236\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7815126050420168,\n \"acc_stderr\": 0.02684151432295894,\n \"acc_norm\": 0.7815126050420168,\n \"acc_norm_stderr\": 0.02684151432295894\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.46357615894039733,\n \"acc_stderr\": 0.04071636065944215,\n \"acc_norm\": 0.46357615894039733,\n \"acc_norm_stderr\": 0.04071636065944215\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.908256880733945,\n \"acc_stderr\": 0.012376323409137103,\n \"acc_norm\": 0.908256880733945,\n \"acc_norm_stderr\": 0.012376323409137103\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.03362277436608043,\n \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.03362277436608043\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8987341772151899,\n \"acc_stderr\": 0.019637720526065498,\n \"acc_norm\": 0.8987341772151899,\n \"acc_norm_stderr\": 0.019637720526065498\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7982062780269058,\n \"acc_stderr\": 0.02693611191280227,\n \"acc_norm\": 0.7982062780269058,\n \"acc_norm_stderr\": 0.02693611191280227\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8702290076335878,\n \"acc_stderr\": 0.029473649496907065,\n \"acc_norm\": 0.8702290076335878,\n \"acc_norm_stderr\": 0.029473649496907065\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8735632183908046,\n \"acc_stderr\": 0.011884488905895538,\n \"acc_norm\": 0.8735632183908046,\n 
\"acc_norm_stderr\": 0.011884488905895538\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6458100558659218,\n \"acc_stderr\": 0.015995644947299225,\n \"acc_norm\": 0.6458100558659218,\n \"acc_norm_stderr\": 0.015995644947299225\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.77491961414791,\n \"acc_stderr\": 0.023720088516179027,\n \"acc_norm\": 0.77491961414791,\n \"acc_norm_stderr\": 0.023720088516179027\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8240740740740741,\n \"acc_stderr\": 0.021185893615225184,\n \"acc_norm\": 0.8240740740740741,\n \"acc_norm_stderr\": 0.021185893615225184\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.029494827600144366,\n \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.029494827600144366\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5612777053455019,\n \"acc_stderr\": 0.012673969883493268,\n \"acc_norm\": 0.5612777053455019,\n \"acc_norm_stderr\": 0.012673969883493268\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.02667925227010314,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.02667925227010314\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.01703522925803403,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.01703522925803403\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546195,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546195\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700637,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700637\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276915,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276915\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n \"mc1_stderr\": 0.017329234580409098,\n \"mc2\": 0.5891645864509103,\n \"mc2_stderr\": 0.015115214729699759\n }\n}\n```", "repo_url": "https://huggingface.co/lloorree/kssht-dahj-70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_09_18T23_50_58.093131", "path": ["**/details_harness|arc:challenge|25_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hellaswag|10_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T23-50-58.093131.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T23_50_58.093131", "path": ["results_2023-09-18T23-50-58.093131.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T23-50-58.093131.parquet"]}]}]}
|
2023-09-18T22:52:21+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of lloorree/kssht-dahj-70b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model lloorree/kssht-dahj-70b on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
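For example (a minimal sketch: the repository id below is inferred from the leaderboard's usual "details_<org>__<model>" naming pattern and is an assumption, while the "harness_truthfulqa_mc_0" config name matches this run's file listing):

```python
from datasets import load_dataset

# Repository id inferred from the leaderboard's naming pattern (assumption);
# the config name matches the per-task configs listed for this run.
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-dahj-70b",
	"harness_truthfulqa_mc_0",
	split="train")
```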
## Latest results
These are the latest results from run 2023-09-18T23:50:58.093131 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of lloorree/kssht-dahj-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-dahj-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T23:50:58.093131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lloorree/kssht-dahj-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-dahj-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T23:50:58.093131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lloorree/kssht-dahj-70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-dahj-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T23:50:58.093131(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
105719773f5b48a9578d1567eda0ddee30ed08c9
|
# Dataset Card for Evaluation run of lloorree/kssht-castor-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-castor-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-castor-70b](https://huggingface.co/lloorree/kssht-castor-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-castor-70b",
"harness_truthfulqa_mc_0",
split="train")
```
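The aggregated "results" configuration mentioned above can be loaded the same way; a short sketch, assuming the same split conventions as the per-task configs:

```python
from datasets import load_dataset

# "latest" always points to the most recent run; a timestamped split
# (here 2023_09_18T23_54_47.734205) would pin a specific evaluation.
results = load_dataset("open-llm-leaderboard/details_lloorree__kssht-castor-70b",
	"results",
	split="latest")
```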
## Latest results
These are the [latest results from run 2023-09-18T23:54:47.734205](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-castor-70b/blob/main/results_2023-09-18T23-54-47.734205.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7025630433354887,
"acc_stderr": 0.03070323641112233,
"acc_norm": 0.7065431366848456,
"acc_norm_stderr": 0.03067233267965294,
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5630669446354012,
"mc2_stderr": 0.014865953800030475
},
"harness|arc:challenge|25": {
"acc": 0.6501706484641638,
"acc_stderr": 0.01393680921215829,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.01344952210993249
},
"harness|hellaswag|10": {
"acc": 0.6857199761003784,
"acc_stderr": 0.004632797375289762,
"acc_norm": 0.8753236407090221,
"acc_norm_stderr": 0.003296764320821918
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8486842105263158,
"acc_stderr": 0.02916263159684399,
"acc_norm": 0.8486842105263158,
"acc_norm_stderr": 0.02916263159684399
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.047551296160629475,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.047551296160629475
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6978723404255319,
"acc_stderr": 0.030017554471880557,
"acc_norm": 0.6978723404255319,
"acc_norm_stderr": 0.030017554471880557
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.0255428468174005,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.0255428468174005
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172523,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172523
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.029311188674983134,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.029311188674983134
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.023664359402880236,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.023664359402880236
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607555,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607555
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7128205128205128,
"acc_stderr": 0.022939925418530616,
"acc_norm": 0.7128205128205128,
"acc_norm_stderr": 0.022939925418530616
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863804,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863804
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9027522935779817,
"acc_stderr": 0.012703533408540366,
"acc_norm": 0.9027522935779817,
"acc_norm_stderr": 0.012703533408540366
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.01831885585008968,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.01831885585008968
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8860759493670886,
"acc_stderr": 0.020681745135884565,
"acc_norm": 0.8860759493670886,
"acc_norm_stderr": 0.020681745135884565
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035202,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035202
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.035207039905179635,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.035207039905179635
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.03044677768797173,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.03044677768797173
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.01911989279892498,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.01911989279892498
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8684546615581098,
"acc_stderr": 0.01208670521425043,
"acc_norm": 0.8684546615581098,
"acc_norm_stderr": 0.01208670521425043
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.791907514450867,
"acc_stderr": 0.021855255263421795,
"acc_norm": 0.791907514450867,
"acc_norm_stderr": 0.021855255263421795
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5463687150837989,
"acc_stderr": 0.016650437588269076,
"acc_norm": 0.5463687150837989,
"acc_norm_stderr": 0.016650437588269076
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023805186524888156,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023805186524888156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.020263764996385717,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.020263764996385717
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5638297872340425,
"acc_stderr": 0.029583452036284076,
"acc_norm": 0.5638297872340425,
"acc_norm_stderr": 0.029583452036284076
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5541069100391134,
"acc_stderr": 0.012695244711379783,
"acc_norm": 0.5541069100391134,
"acc_norm_stderr": 0.012695244711379783
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.0265565194700415,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.0265565194700415
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.01707737337785693,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.01707737337785693
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8163265306122449,
"acc_stderr": 0.024789071332007636,
"acc_norm": 0.8163265306122449,
"acc_norm_stderr": 0.024789071332007636
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.022076326101824664,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.022076326101824664
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.02567934272327692,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.02567934272327692
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40024479804161567,
"mc1_stderr": 0.01715160555574914,
"mc2": 0.5630669446354012,
"mc2_stderr": 0.014865953800030475
}
}
```
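As a quick illustration of working with the dictionary above (a hypothetical snippet, not part of the evaluation harness), the per-subject MMLU accuracies can be averaged in a few lines:

```python
# Minimal sketch: average "acc" over the hendrycksTest tasks. Only two
# entries from the JSON above are reproduced here for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
}
accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
print(sum(accs) / len(accs))  # mean accuracy over the selected subjects
```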
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_lloorree__kssht-castor-70b
|
[
"region:us"
] |
2023-09-18T22:55:11+00:00
|
{"pretty_name": "Evaluation run of lloorree/kssht-castor-70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [lloorree/kssht-castor-70b](https://huggingface.co/lloorree/kssht-castor-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-castor-70b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T23:54:47.734205](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-castor-70b/blob/main/results_2023-09-18T23-54-47.734205.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7025630433354887,\n \"acc_stderr\": 0.03070323641112233,\n \"acc_norm\": 0.7065431366848456,\n \"acc_norm_stderr\": 0.03067233267965294,\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5630669446354012,\n \"mc2_stderr\": 0.014865953800030475\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6501706484641638,\n \"acc_stderr\": 0.01393680921215829,\n \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.01344952210993249\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6857199761003784,\n \"acc_stderr\": 0.004632797375289762,\n \"acc_norm\": 0.8753236407090221,\n \"acc_norm_stderr\": 0.003296764320821918\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8486842105263158,\n \"acc_stderr\": 0.02916263159684399,\n \"acc_norm\": 0.8486842105263158,\n \"acc_norm_stderr\": 0.02916263159684399\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n 
\"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.047551296160629475,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.047551296160629475\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6978723404255319,\n \"acc_stderr\": 0.030017554471880557,\n \"acc_norm\": 0.6978723404255319,\n \"acc_norm_stderr\": 0.030017554471880557\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451207,\n \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451207\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.0255428468174005,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.0255428468174005\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n \"acc_stderr\": 0.021886178567172523,\n \"acc_norm\": 0.8193548387096774,\n \"acc_norm_stderr\": 0.021886178567172523\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.029311188674983134,\n \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.029311188674983134\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.023664359402880236,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.023664359402880236\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607555,\n \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607555\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7128205128205128,\n 
\"acc_stderr\": 0.022939925418530616,\n \"acc_norm\": 0.7128205128205128,\n \"acc_norm_stderr\": 0.022939925418530616\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863804,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863804\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.9027522935779817,\n \"acc_stderr\": 0.012703533408540366,\n \"acc_norm\": 0.9027522935779817,\n \"acc_norm_stderr\": 0.012703533408540366\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.01831885585008968,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.01831885585008968\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8860759493670886,\n \"acc_stderr\": 0.020681745135884565,\n \"acc_norm\": 0.8860759493670886,\n \"acc_norm_stderr\": 0.020681745135884565\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035202,\n \"acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035202\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.035207039905179635,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.035207039905179635\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.03044677768797173,\n \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.03044677768797173\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n \"acc_stderr\": 0.01911989279892498,\n \"acc_norm\": 0.905982905982906,\n \"acc_norm_stderr\": 0.01911989279892498\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8684546615581098,\n \"acc_stderr\": 0.01208670521425043,\n \"acc_norm\": 
0.8684546615581098,\n \"acc_norm_stderr\": 0.01208670521425043\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.791907514450867,\n \"acc_stderr\": 0.021855255263421795,\n \"acc_norm\": 0.791907514450867,\n \"acc_norm_stderr\": 0.021855255263421795\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5463687150837989,\n \"acc_stderr\": 0.016650437588269076,\n \"acc_norm\": 0.5463687150837989,\n \"acc_norm_stderr\": 0.016650437588269076\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023805186524888156,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023805186524888156\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.020263764996385717,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.020263764996385717\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5638297872340425,\n \"acc_stderr\": 0.029583452036284076,\n \"acc_norm\": 0.5638297872340425,\n \"acc_norm_stderr\": 0.029583452036284076\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5541069100391134,\n \"acc_stderr\": 0.012695244711379783,\n \"acc_norm\": 0.5541069100391134,\n \"acc_norm_stderr\": 0.012695244711379783\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.0265565194700415,\n \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.0265565194700415\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.01707737337785693,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.01707737337785693\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8163265306122449,\n \"acc_stderr\": 0.024789071332007636,\n \"acc_norm\": 0.8163265306122449,\n \"acc_norm_stderr\": 0.024789071332007636\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n \"acc_stderr\": 0.022076326101824664,\n \"acc_norm\": 0.8905472636815921,\n \"acc_norm_stderr\": 0.022076326101824664\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.02567934272327692,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.02567934272327692\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40024479804161567,\n \"mc1_stderr\": 0.01715160555574914,\n \"mc2\": 0.5630669446354012,\n \"mc2_stderr\": 0.014865953800030475\n }\n}\n```", "repo_url": "https://huggingface.co/lloorree/kssht-castor-70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": 
"2023_09_18T23_54_47.734205", "path": ["**/details_harness|arc:challenge|25_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hellaswag|10_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-18T23-54-47.734205.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_18T23_54_47.734205", "path": ["results_2023-09-18T23-54-47.734205.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T23-54-47.734205.parquet"]}]}]}
|
2023-09-18T22:56:08+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of lloorree/kssht-castor-70b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model lloorree/kssht-castor-70b on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
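A minimal sketch of that load, assuming the repo id follows the `details_<org>__<model>` naming convention (the `harness_truthfulqa_mc_0` config name is taken from the configuration list above):
```python
from datasets import load_dataset

# Load the per-example details for one task config; "train" points at the latest run.
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-castor-70b",
	"harness_truthfulqa_mc_0",
	split="train")
```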
## Latest results
These are the latest results from run 2023-09-18T23:54:47.734205 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of lloorree/kssht-castor-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-castor-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T23:54:47.734205(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lloorree/kssht-castor-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-castor-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T23:54:47.734205(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lloorree/kssht-castor-70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-castor-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T23:54:47.734205(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
59a29e4fbea28785ee19948e9338fa6610325570
|
# Dataset Card for "cifar10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
MaxReynolds/cifar10
|
[
"region:us"
] |
2023-09-18T23:07:40+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": {"class_label": {"names": {"0": "airplane", "1": "automobile", "2": "bird", "3": "cat", "4": "deer", "5": "dog", "6": "frog", "7": "horse", "8": "ship", "9": "truck"}}}}], "splits": [{"name": "train", "num_bytes": 113648310.0, "num_examples": 50000}], "download_size": 119708256, "dataset_size": 113648310.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T23:07:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "cifar10"
More Information needed
|
[
"# Dataset Card for \"cifar10\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"cifar10\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"cifar10\"\n\nMore Information needed"
] |
fa1bb61edc19468b1156cf9436b9e41fabcfcf04
|
# Dataset Card for Evaluation run of lloorree/kssht-euripedes-70b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lloorree/kssht-euripedes-70b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [lloorree/kssht-euripedes-70b](https://huggingface.co/lloorree/kssht-euripedes-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
"harness_truthfulqa_mc_0",
split="train")
```
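The per-task configurations return one row per evaluated example; the aggregated numbers shown under "Latest results" below are also exposed through the "results" configuration. A minimal sketch, assuming the "results" config and its "latest" split follow the same convention as the per-task configs (the schema is an assumption, not confirmed here):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run (assumed: one row holding the scores).
results = load_dataset("open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
	"results",
	split="latest")
print(results[0])
```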
## Latest results
These are the [latest results from run 2023-09-19T00:12:39.048571](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-euripedes-70b/blob/main/results_2023-09-19T00-12-39.048571.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7032771782081723,
"acc_stderr": 0.030834102504125972,
"acc_norm": 0.70714084898032,
"acc_norm_stderr": 0.030804015376568177,
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5551008582453495,
"mc2_stderr": 0.014893190834168417
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.013855831287497723,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716413
},
"harness|hellaswag|10": {
"acc": 0.6872137024497113,
"acc_stderr": 0.004626805906522211,
"acc_norm": 0.8759211312487553,
"acc_norm_stderr": 0.0032899775233939097
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996794,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996794
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8092105263157895,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.8092105263157895,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6723404255319149,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.6723404255319149,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6620689655172414,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.6620689655172414,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02559185776138218,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02559185776138218
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.02406315641682252,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.02406315641682252
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223157,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223157
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7230769230769231,
"acc_stderr": 0.022688042352424994,
"acc_norm": 0.7230769230769231,
"acc_norm_stderr": 0.022688042352424994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361276,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361276
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.47019867549668876,
"acc_stderr": 0.040752249922169775,
"acc_norm": 0.47019867549668876,
"acc_norm_stderr": 0.040752249922169775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8972477064220183,
"acc_stderr": 0.013018246509173768,
"acc_norm": 0.8972477064220183,
"acc_norm_stderr": 0.013018246509173768
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997866,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997866
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.020343400734868834,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.020343400734868834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7937219730941704,
"acc_stderr": 0.02715715047956382,
"acc_norm": 0.7937219730941704,
"acc_norm_stderr": 0.02715715047956382
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8778625954198473,
"acc_stderr": 0.028718776889342337,
"acc_norm": 0.8778625954198473,
"acc_norm_stderr": 0.028718776889342337
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8677685950413223,
"acc_stderr": 0.03092278832044579,
"acc_norm": 0.8677685950413223,
"acc_norm_stderr": 0.03092278832044579
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.03434300243631001,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.03434300243631001
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.803680981595092,
"acc_stderr": 0.031207970394709218,
"acc_norm": 0.803680981595092,
"acc_norm_stderr": 0.031207970394709218
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.039166677628225836,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.039166677628225836
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8620689655172413,
"acc_stderr": 0.012331009307795656,
"acc_norm": 0.8620689655172413,
"acc_norm_stderr": 0.012331009307795656
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7947976878612717,
"acc_stderr": 0.021742519835276274,
"acc_norm": 0.7947976878612717,
"acc_norm_stderr": 0.021742519835276274
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5575418994413408,
"acc_stderr": 0.016611393687268574,
"acc_norm": 0.5575418994413408,
"acc_norm_stderr": 0.016611393687268574
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7679738562091504,
"acc_stderr": 0.024170840879340873,
"acc_norm": 0.7679738562091504,
"acc_norm_stderr": 0.024170840879340873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7717041800643086,
"acc_stderr": 0.0238393033113982,
"acc_norm": 0.7717041800643086,
"acc_norm_stderr": 0.0238393033113982
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8395061728395061,
"acc_stderr": 0.02042395535477803,
"acc_norm": 0.8395061728395061,
"acc_norm_stderr": 0.02042395535477803
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.02965823509766691,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.02965823509766691
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5580182529335072,
"acc_stderr": 0.012683972513598827,
"acc_norm": 0.5580182529335072,
"acc_norm_stderr": 0.012683972513598827
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887657,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887657
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.01724238582877962,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.01724238582877962
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.7545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.025000256039546188,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.025000256039546188
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8955223880597015,
"acc_stderr": 0.021628920516700643,
"acc_norm": 0.8955223880597015,
"acc_norm_stderr": 0.021628920516700643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8596491228070176,
"acc_stderr": 0.0266405825391332,
"acc_norm": 0.8596491228070176,
"acc_norm_stderr": 0.0266405825391332
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3953488372093023,
"mc1_stderr": 0.017115815632418197,
"mc2": 0.5551008582453495,
"mc2_stderr": 0.014893190834168417
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_lloorree__kssht-euripedes-70b
|
[
"region:us"
] |
2023-09-18T23:13:02+00:00
|
{"pretty_name": "Evaluation run of lloorree/kssht-euripedes-70b", "dataset_summary": "Dataset automatically created during the evaluation run of model [lloorree/kssht-euripedes-70b](https://huggingface.co/lloorree/kssht-euripedes-70b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lloorree__kssht-euripedes-70b\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-19T00:12:39.048571](https://huggingface.co/datasets/open-llm-leaderboard/details_lloorree__kssht-euripedes-70b/blob/main/results_2023-09-19T00-12-39.048571.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7032771782081723,\n \"acc_stderr\": 0.030834102504125972,\n \"acc_norm\": 0.70714084898032,\n \"acc_norm_stderr\": 0.030804015376568177,\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5551008582453495,\n \"mc2_stderr\": 0.014893190834168417\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.013855831287497723,\n \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6872137024497113,\n \"acc_stderr\": 0.004626805906522211,\n \"acc_norm\": 0.8759211312487553,\n \"acc_norm_stderr\": 0.0032899775233939097\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996794,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996794\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8092105263157895,\n \"acc_stderr\": 0.031975658210325,\n \"acc_norm\": 0.8092105263157895,\n \"acc_norm_stderr\": 0.031975658210325\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 
0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6723404255319149,\n \"acc_stderr\": 0.030683020843231004,\n \"acc_norm\": 0.6723404255319149,\n \"acc_norm_stderr\": 0.030683020843231004\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6620689655172414,\n \"acc_stderr\": 0.039417076320648906,\n \"acc_norm\": 0.6620689655172414,\n \"acc_norm_stderr\": 0.039417076320648906\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02559185776138218,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02559185776138218\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.8129032258064516,\n \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8686868686868687,\n \"acc_stderr\": 0.02406315641682252,\n \"acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.02406315641682252\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223157,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223157\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.7230769230769231,\n \"acc_stderr\": 0.022688042352424994,\n \"acc_norm\": 0.7230769230769231,\n \"acc_norm_stderr\": 0.022688042352424994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361276,\n \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361276\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.47019867549668876,\n \"acc_stderr\": 0.040752249922169775,\n \"acc_norm\": 0.47019867549668876,\n \"acc_norm_stderr\": 0.040752249922169775\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8972477064220183,\n \"acc_stderr\": 0.013018246509173768,\n \"acc_norm\": 0.8972477064220183,\n \"acc_norm_stderr\": 0.013018246509173768\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997866,\n \"acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997866\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.890295358649789,\n \"acc_stderr\": 0.020343400734868834,\n \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.020343400734868834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7937219730941704,\n \"acc_stderr\": 0.02715715047956382,\n \"acc_norm\": 0.7937219730941704,\n \"acc_norm_stderr\": 0.02715715047956382\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8778625954198473,\n \"acc_stderr\": 0.028718776889342337,\n \"acc_norm\": 0.8778625954198473,\n \"acc_norm_stderr\": 0.028718776889342337\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8677685950413223,\n \"acc_stderr\": 0.03092278832044579,\n \"acc_norm\": 0.8677685950413223,\n \"acc_norm_stderr\": 0.03092278832044579\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.03434300243631001,\n \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.03434300243631001\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.803680981595092,\n \"acc_stderr\": 0.031207970394709218,\n \"acc_norm\": 0.803680981595092,\n \"acc_norm_stderr\": 0.031207970394709218\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.039166677628225836,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.039166677628225836\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8620689655172413,\n \"acc_stderr\": 0.012331009307795656,\n 
\"acc_norm\": 0.8620689655172413,\n \"acc_norm_stderr\": 0.012331009307795656\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7947976878612717,\n \"acc_stderr\": 0.021742519835276274,\n \"acc_norm\": 0.7947976878612717,\n \"acc_norm_stderr\": 0.021742519835276274\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5575418994413408,\n \"acc_stderr\": 0.016611393687268574,\n \"acc_norm\": 0.5575418994413408,\n \"acc_norm_stderr\": 0.016611393687268574\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7679738562091504,\n \"acc_stderr\": 0.024170840879340873,\n \"acc_norm\": 0.7679738562091504,\n \"acc_norm_stderr\": 0.024170840879340873\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7717041800643086,\n \"acc_stderr\": 0.0238393033113982,\n \"acc_norm\": 0.7717041800643086,\n \"acc_norm_stderr\": 0.0238393033113982\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.8395061728395061,\n \"acc_stderr\": 0.02042395535477803,\n \"acc_norm\": 0.8395061728395061,\n \"acc_norm_stderr\": 0.02042395535477803\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.02965823509766691,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.02965823509766691\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5580182529335072,\n \"acc_stderr\": 0.012683972513598827,\n \"acc_norm\": 0.5580182529335072,\n \"acc_norm_stderr\": 0.012683972513598827\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887657,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887657\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.01724238582877962,\n \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.01724238582877962\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7545454545454545,\n \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.7545454545454545,\n \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.025000256039546188,\n \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.025000256039546188\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8955223880597015,\n \"acc_stderr\": 0.021628920516700643,\n \"acc_norm\": 0.8955223880597015,\n \"acc_norm_stderr\": 0.021628920516700643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8596491228070176,\n \"acc_stderr\": 0.0266405825391332,\n \"acc_norm\": 0.8596491228070176,\n \"acc_norm_stderr\": 0.0266405825391332\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3953488372093023,\n \"mc1_stderr\": 0.017115815632418197,\n \"mc2\": 0.5551008582453495,\n \"mc2_stderr\": 0.014893190834168417\n }\n}\n```", "repo_url": "https://huggingface.co/lloorree/kssht-euripedes-70b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", 
"data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|arc:challenge|25_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hellaswag|10_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T00-12-39.048571.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_19T00_12_39.048571", "path": ["results_2023-09-19T00-12-39.048571.parquet"]}, {"split": "latest", "path": ["results_2023-09-19T00-12-39.048571.parquet"]}]}]}
|
2023-09-18T23:14:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of lloorree/kssht-euripedes-70b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model lloorree/kssht-euripedes-70b on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
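For example (a minimal sketch: the repository id `open-llm-leaderboard/details_lloorree__kssht-euripedes-70b` is assumed from the naming convention of the other evaluation cards in this dump, and `harness_truthfulqa_mc_0` is one of the configs declared in this card's metadata):
```python
from datasets import load_dataset

# Repository id assumed from the leaderboard's naming convention;
# "harness_truthfulqa_mc_0" is a config listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
	"harness_truthfulqa_mc_0",
	split="train")
```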
## Latest results
These are the latest results from run 2023-09-19T00:12:39.048571 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
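The aggregated numbers themselves live in the "results" configuration declared in this card's metadata; a minimal sketch of retrieving the latest ones (same assumed repository id as above):
```python
from datasets import load_dataset

# The "results" config and its "latest" split are declared in this card's
# metadata; the repository id is assumed from the naming convention.
results = load_dataset("open-llm-leaderboard/details_lloorree__kssht-euripedes-70b",
	"results",
	split="latest")
print(results[0])
```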
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of lloorree/kssht-euripedes-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-euripedes-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-19T00:12:39.048571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of lloorree/kssht-euripedes-70b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-euripedes-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-19T00:12:39.048571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of lloorree/kssht-euripedes-70b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model lloorree/kssht-euripedes-70b on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-19T00:12:39.048571(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
619b8c97cba00103c9453a2bb698529637f42935
|
# Dataset Card for "cifar10_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
MaxReynolds/cifar10_v2
|
[
"region:us"
] |
2023-09-18T23:18:11+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "airplane", "1": "automobile", "2": "bird", "3": "cat", "4": "deer", "5": "dog", "6": "frog", "7": "horse", "8": "ship", "9": "truck"}}}}], "splits": [{"name": "train", "num_bytes": 113648310.0, "num_examples": 50000}], "download_size": 119709270, "dataset_size": 113648310.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T23:18:17+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "cifar10_v2"
More Information needed
|
[
"# Dataset Card for \"cifar10_v2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"cifar10_v2\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"cifar10_v2\"\n\nMore Information needed"
] |
28cb126ed27bd5d57a663a6eb5304c7be89e92f1
|
# Dataset Card for "908725e5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/908725e5
|
[
"region:us"
] |
2023-09-18T23:19:23+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 161, "num_examples": 10}], "download_size": 1318, "dataset_size": 161}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T23:19:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "908725e5"
More Information needed
|
[
"# Dataset Card for \"908725e5\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"908725e5\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"908725e5\"\n\nMore Information needed"
] |
5fd31aa0bf20297545348c86d07392a7208acce9
|
# Dataset Card for "e87ec3b2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/e87ec3b2
|
[
"region:us"
] |
2023-09-18T23:21:50+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 153, "num_examples": 10}], "download_size": 1306, "dataset_size": 153}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T23:21:50+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "e87ec3b2"
More Information needed
|
[
"# Dataset Card for \"e87ec3b2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"e87ec3b2\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"e87ec3b2\"\n\nMore Information needed"
] |
d44e857c79f43da9e3321ab26c624d38cfe99972
|
# Dataset Card for "MetalDam_Cropped"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
ironchanchellor/MetalDam_Cropped
|
[
"region:us"
] |
2023-09-18T23:22:21+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "pixel_values", "dtype": "image"}, {"name": "label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 43505113.0, "num_examples": 124}, {"name": "validation", "num_bytes": 11683804.0, "num_examples": 32}], "download_size": 55199351, "dataset_size": 55188917.0}}
|
2023-09-18T23:24:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "MetalDam_Cropped"
More Information needed
|
[
"# Dataset Card for \"MetalDam_Cropped\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"MetalDam_Cropped\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"MetalDam_Cropped\"\n\nMore Information needed"
] |
8e741c26ab309bc3c41bead3ff6ee8ddc761ca95
|
# Dataset Card for "9e7f6f37"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/9e7f6f37
|
[
"region:us"
] |
2023-09-18T23:24:02+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 152, "num_examples": 10}], "download_size": 1303, "dataset_size": 152}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-18T23:24:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "9e7f6f37"
More Information needed
|
[
"# Dataset Card for \"9e7f6f37\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"9e7f6f37\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"9e7f6f37\"\n\nMore Information needed"
] |
a6c42d6dcd6ea62981974f3297f3921d8c7454d0
|
# Dataset Card for Evaluation run of Sao10K/Euryale-L2-70B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Sao10K/Euryale-L2-70B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Sao10K/Euryale-L2-70B](https://huggingface.co/Sao10K/Euryale-L2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Euryale-L2-70B",
"harness_winogrande_5",
split="train")
```
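Once loaded, the split behaves like any `datasets.Dataset`; for per-example inspection it can be converted to pandas (a minimal sketch; the exact column layout depends on what the evaluation harness logged):
```python
# `data` is the Dataset returned by the snippet above.
df = data.to_pandas()
print(df.head())
```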
## Latest results
These are the [latest results from run 2023-10-29T17:20:57.246937](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Euryale-L2-70B/blob/main/results_2023-10-29T17-20-57.246937.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893235,
"f1": 0.06751782718120815,
"f1_stderr": 0.0013937914519446145,
"acc": 0.5430945808722376,
"acc_stderr": 0.011469812310058832
},
"harness|drop|3": {
"em": 0.0024119127516778523,
"em_stderr": 0.0005023380498893235,
"f1": 0.06751782718120815,
"f1_stderr": 0.0013937914519446145
},
"harness|gsm8k|5": {
"acc": 0.265352539802881,
"acc_stderr": 0.012161675464069675
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047987
}
}
```
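Earlier runs remain available under their timestamped splits, as described in the summary above; for example (a minimal sketch, with the split name taken from the `harness_gsm8k_5` config in this card's metadata):
```python
from datasets import load_dataset

# The timestamped split name comes from the "harness_gsm8k_5" config
# declared in this card's metadata; each run keeps its own split.
gsm8k_run = load_dataset("open-llm-leaderboard/details_Sao10K__Euryale-L2-70B",
	"harness_gsm8k_5",
	split="2023_10_29T17_20_57.246937")
```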
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Sao10K__Euryale-L2-70B
|
[
"region:us"
] |
2023-09-18T23:30:46+00:00
|
{"pretty_name": "Evaluation run of Sao10K/Euryale-L2-70B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Euryale-L2-70B](https://huggingface.co/Sao10K/Euryale-L2-70B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Euryale-L2-70B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T17:20:57.246937](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Euryale-L2-70B/blob/main/results_2023-10-29T17-20-57.246937.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893235,\n \"f1\": 0.06751782718120815,\n \"f1_stderr\": 0.0013937914519446145,\n \"acc\": 0.5430945808722376,\n \"acc_stderr\": 0.011469812310058832\n },\n \"harness|drop|3\": {\n \"em\": 0.0024119127516778523,\n \"em_stderr\": 0.0005023380498893235,\n \"f1\": 0.06751782718120815,\n \"f1_stderr\": 0.0013937914519446145\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.265352539802881,\n \"acc_stderr\": 0.012161675464069675\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047987\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Euryale-L2-70B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|arc:challenge|25_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T17_20_57.246937", "path": ["**/details_harness|drop|3_2023-10-29T17-20-57.246937.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T17-20-57.246937.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T17_20_57.246937", "path": ["**/details_harness|gsm8k|5_2023-10-29T17-20-57.246937.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T17-20-57.246937.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hellaswag|10_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T00-30-23.278534.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T00-30-23.278534.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T17_20_57.246937", "path": ["**/details_harness|winogrande|5_2023-10-29T17-20-57.246937.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T17-20-57.246937.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_19T00_30_23.278534", "path": ["results_2023-09-19T00-30-23.278534.parquet"]}, {"split": "2023_10_29T17_20_57.246937", "path": ["results_2023-10-29T17-20-57.246937.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T17-20-57.246937.parquet"]}]}]}
|
2023-10-29T17:21:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Sao10K/Euryale-L2-70B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Sao10K/Euryale-L2-70B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
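For example, a minimal sketch (the repository id below follows the leaderboard's `details_<org>__<model>` naming convention and is an assumption; the `harness_winogrande_5` config name comes from this card's metadata):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task;
# the "train" split always points at the latest run.
data = load_dataset("open-llm-leaderboard/details_Sao10K__Euryale-L2-70B",  # assumed repo id
                    "harness_winogrande_5",
                    split="train")
```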
## Latest results
These are the latest results from run 2023-10-29T17:20:57.246937 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Sao10K/Euryale-L2-70B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Euryale-L2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T17:20:57.246937(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sao10K/Euryale-L2-70B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Euryale-L2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T17:20:57.246937(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Euryale-L2-70B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Sao10K/Euryale-L2-70B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T17:20:57.246937(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d5f6bcd0659f9149bce98a774e7d92844b49e612
|
# Dataset Card for "dataset_pfs_hr_by_subgroup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
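As a starting point, a minimal loading sketch (column names and split size are taken from this card's `dataset_info` metadata, not from any documentation):

```python
from datasets import load_dataset

# Single "train" split with 8,668 examples per the metadata.
ds = load_dataset("yicozy/dataset_pfs_hr_by_subgroup", split="train")
print(ds.column_names)  # expected: ["instruction", "response", "text"]
```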
|
yicozy/dataset_pfs_hr_by_subgroup
|
[
"region:us"
] |
2023-09-18T23:46:16+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6991314, "num_examples": 8668}], "download_size": 0, "dataset_size": 6991314}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-19T00:04:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dataset_pfs_hr_by_subgroup"
More Information needed
|
[
"# Dataset Card for \"dataset_pfs_hr_by_subgroup\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dataset_pfs_hr_by_subgroup\"\n\nMore Information needed"
] |
[
6,
22
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dataset_pfs_hr_by_subgroup\"\n\nMore Information needed"
] |
5ae368209a77f9b687c55d7dc6bd1685d6e96fbc
|
# Dataset Card for "whisper-v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
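As a starting point, a minimal loading sketch (splits and features are taken from this card's metadata; decoding the `audio` column may require the usual audio extras for `datasets`):

```python
from datasets import load_dataset

# Per the metadata: "train" (324 examples) and "test" (10 examples),
# each row holding an "audio" feature and a "transcription" string.
ds = load_dataset("MathiasFoster/whisper-v4")
print(ds["train"][0]["transcription"])
```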
|
MathiasFoster/whisper-v4
|
[
"region:us"
] |
2023-09-18T23:52:59+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcription", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19948406.0, "num_examples": 324}, {"name": "test", "num_bytes": 607133.0, "num_examples": 10}], "download_size": 20047841, "dataset_size": 20555539.0}}
|
2023-09-18T23:53:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "whisper-v4"
More Information needed
|
[
"# Dataset Card for \"whisper-v4\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"whisper-v4\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"whisper-v4\"\n\nMore Information needed"
] |
0dbdabe5e4fdd7610414a98b31163725dd8b9d22
|
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1.1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.1_public",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-11-09T05:03:34.118440](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.1_public/blob/main/results_2023-11-09T05-03-34.118440.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.24800755033557048,
"em_stderr": 0.004422612027820539,
"f1": 0.40814072986577404,
"f1_stderr": 0.004137188687530774,
"acc": 0.6205347980131322,
"acc_stderr": 0.012108370161317753
},
"harness|drop|3": {
"em": 0.24800755033557048,
"em_stderr": 0.004422612027820539,
"f1": 0.40814072986577404,
"f1_stderr": 0.004137188687530774
},
"harness|gsm8k|5": {
"acc": 0.41470811220621684,
"acc_stderr": 0.013570623842304504
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480331001
}
}
```
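To pull these aggregated numbers programmatically, a minimal sketch (the `results` config and its `latest` split are listed in this card's metadata):

```python
from datasets import load_dataset

# The "latest" split of the "results" config holds the aggregated metrics above.
results = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.1_public",
                       "results",
                       split="latest")
```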
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.1
|
[
"region:us"
] |
2023-09-19T00:00:38+00:00
|
{"pretty_name": "Evaluation run of ICBU-NPU/FashionGPT-70B-V1.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1.1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.1_public\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-11-09T05:03:34.118440](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1.1_public/blob/main/results_2023-11-09T05-03-34.118440.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.24800755033557048,\n \"em_stderr\": 0.004422612027820539,\n \"f1\": 0.40814072986577404,\n \"f1_stderr\": 0.004137188687530774,\n \"acc\": 0.6205347980131322,\n \"acc_stderr\": 0.012108370161317753\n },\n \"harness|drop|3\": {\n \"em\": 0.24800755033557048,\n \"em_stderr\": 0.004422612027820539,\n \"f1\": 0.40814072986577404,\n \"f1_stderr\": 0.004137188687530774\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41470811220621684,\n \"acc_stderr\": 0.013570623842304504\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480331001\n }\n}\n```", "repo_url": "https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_drop_3", "data_files": [{"split": "2023_11_09T05_03_34.118440", "path": ["**/details_harness|drop|3_2023-11-09T05-03-34.118440.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-11-09T05-03-34.118440.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_11_09T05_03_34.118440", "path": ["**/details_harness|gsm8k|5_2023-11-09T05-03-34.118440.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-11-09T05-03-34.118440.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_11_09T05_03_34.118440", "path": ["**/details_harness|winogrande|5_2023-11-09T05-03-34.118440.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-11-09T05-03-34.118440.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_11_09T05_03_34.118440", "path": ["results_2023-11-09T05-03-34.118440.parquet"]}, {"split": "latest", "path": ["results_2023-11-09T05-03-34.118440.parquet"]}]}]}
|
2023-12-01T14:54:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1.1 on the Open LLM Leaderboard.
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-11-09T05:03:34.118440 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-09T05:03:34.118440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-11-09T05:03:34.118440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
174,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 3 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-11-09T05:03:34.118440(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d95e6ce58939265f266d0ddc05acbb77ef0a3041
|
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-29T15:24:49.736716](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1/blob/main/results_2023-10-29T15-24-49.736716.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666993,
"f1": 0.07125104865771813,
"f1_stderr": 0.0014102826102321945,
"acc": 0.5589478168926856,
"acc_stderr": 0.011387742640607
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666993,
"f1": 0.07125104865771813,
"f1_stderr": 0.0014102826102321945
},
"harness|gsm8k|5": {
"acc": 0.2812736921910538,
"acc_stderr": 0.012384789310940239
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
}
}
```
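As above, the aggregated numbers can be pulled via the `results` config; a minimal sketch (the `latest` split name follows the convention used throughout these cards):

```python
from datasets import load_dataset

# The "latest" split of the "results" config holds the aggregated metrics above.
results = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1",
                       "results",
                       split="latest")
```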
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1
|
[
"region:us"
] |
2023-09-19T00:12:41+00:00
|
{"pretty_name": "Evaluation run of ICBU-NPU/FashionGPT-70B-V1", "dataset_summary": "Dataset automatically created during the evaluation run of model [ICBU-NPU/FashionGPT-70B-V1](https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-29T15:24:49.736716](https://huggingface.co/datasets/open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1/blob/main/results_2023-10-29T15-24-49.736716.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666993,\n \"f1\": 0.07125104865771813,\n \"f1_stderr\": 0.0014102826102321945,\n \"acc\": 0.5589478168926856,\n \"acc_stderr\": 0.011387742640607\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666993,\n \"f1\": 0.07125104865771813,\n \"f1_stderr\": 0.0014102826102321945\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2812736921910538,\n \"acc_stderr\": 0.012384789310940239\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n }\n}\n```", "repo_url": "https://huggingface.co/ICBU-NPU/FashionGPT-70B-V1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|arc:challenge|25_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_29T15_24_49.736716", "path": ["**/details_harness|drop|3_2023-10-29T15-24-49.736716.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-29T15-24-49.736716.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_29T15_24_49.736716", "path": ["**/details_harness|gsm8k|5_2023-10-29T15-24-49.736716.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-29T15-24-49.736716.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hellaswag|10_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T01-12-17.792946.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-19T01-12-17.792946.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_29T15_24_49.736716", "path": ["**/details_harness|winogrande|5_2023-10-29T15-24-49.736716.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-29T15-24-49.736716.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_19T01_12_17.792946", "path": ["results_2023-09-19T01-12-17.792946.parquet"]}, {"split": "2023_10_29T15_24_49.736716", "path": ["results_2023-10-29T15-24-49.736716.parquet"]}, {"split": "latest", "path": ["results_2023-10-29T15-24-49.736716.parquet"]}]}]}
|
2023-10-29T15:25:02+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
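For example (a minimal sketch: the details-repo name below is inferred from the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations listed in this repo's metadata):

```python
from datasets import load_dataset

# Load the winogrande details from the latest run of this model.
data = load_dataset("open-llm-leaderboard/details_ICBU-NPU__FashionGPT-70B-V1",
                    "harness_winogrande_5",
                    split="train")
```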
## Latest results
These are the latest results from run 2023-10-29T15:24:49.736716 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T15:24:49.736716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-29T15:24:49.736716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
25,
31,
173,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ICBU-NPU/FashionGPT-70B-V1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ICBU-NPU/FashionGPT-70B-V1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-29T15:24:49.736716(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
7f74b9668f126d023fda7fa0555ce9878f37b092
|
language:
- pt
task_categories:
- summarization
not official
|
godoyj/temario
|
[
"region:us"
] |
2023-09-19T00:28:46+00:00
|
{}
|
2023-09-19T00:37:27+00:00
|
[] |
[] |
TAGS
#region-us
|
language:
- pt
task_categories:
- summarization
not official
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
133e14b7cf709605df74b80b355f299f1433bab4
|
# Dataset Card for "ad45b2bb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/ad45b2bb
|
[
"region:us"
] |
2023-09-19T00:34:45+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 188, "num_examples": 10}], "download_size": 1388, "dataset_size": 188}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-19T00:34:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ad45b2bb"
More Information needed
|
[
"# Dataset Card for \"ad45b2bb\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ad45b2bb\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ad45b2bb\"\n\nMore Information needed"
] |
a7a45e76fbef1d7c9013dc4d75b17634a65d69f0
|
# WuDao (悟道) Dataset
Not the original creator; this is only a mirror.
The download is about 60 GB, expanding to about 220 GB after decompression.
### Original Link
[Science Data Bank](https://www.scidb.cn/en/detail?dataSetId=c6a3fe684227415a9db8e21bac4a15ab)
## Usage
```bash
sudo apt install unrar
pip install patool wget opencc
```
```python
from datasets import load_dataset
# Simplified Chinese
load_dataset("p208p2002/wudao",streaming=True,split="zhs")
# Traditional Chinese (converted with OpenCC)
load_dataset("p208p2002/wudao",streaming=True,split="zht")
```
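A minimal sketch of iterating the streamed split (streaming avoids materializing the full ~220 GB corpus locally):

```python
from itertools import islice

from datasets import load_dataset

# Stream the Traditional Chinese split and inspect the first few records.
ds = load_dataset("p208p2002/wudao", streaming=True, split="zht")
for record in islice(ds, 3):
    print(record)
```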
## Clearing Data
If a download fails, clear the cached data manually:
```bash
rm -rf ~/.cache/wudao_dataset
```
## Category Statistics
```json
{
"_total": 59100001,
"豆瓣话题": 209027,
"科技": 1278068,
"经济": 1096215,
"汽车": 1368193,
"娱乐": 1581947,
"农业": 1129758,
"军事": 420949,
"社会": 446228,
"游戏": 754703,
"教育": 1133453,
"体育": 660858,
"旅行": 821573,
"国际": 630386,
"房产": 387786,
"文化": 710648,
"法律": 36585,
"股票": 1205,
"博客": 15467790,
"日报": 16971,
"评论": 13867,
"孕育常识": 48291,
"健康": 15291,
"财经": 54656,
"医学问答": 314771,
"资讯": 1066180,
"科普文章": 60581,
"百科": 27273280,
"酒业": 287,
"经验": 609195,
"新闻": 846810,
"小红书攻略": 185379,
"生活": 23,
"网页文本": 115830,
"观点": 1268,
"海外": 4,
"户外": 5,
"美容": 7,
"理论": 247,
"天气": 540,
"文旅": 2999,
"信托": 62,
"保险": 70,
"水利资讯": 17,
"时尚": 1123,
"亲子": 39,
"百家号文章": 335591,
"黄金": 216,
"党建": 1,
"期货": 330,
"快讯": 41,
"国内": 15,
"国学": 614,
"公益": 15,
"能源": 7,
"创新": 6
}
```
## Cite
```
@misc{ c6a3fe684227415a9db8e21bac4a15ab,
author = {Zhao Xue and Hanyu Zhao and Sha Yuan and Yequan Wang},
title = {{WuDaoCorpora Text}},
year = 2022,
month = dec,
publisher = {Science Data Bank},
version = {V1},
doi = {10.57760/sciencedb.o00126.00004},
url = {https://doi.org/10.57760/sciencedb.o00126.00004}
}
```
|
p208p2002/wudao
|
[
"task_categories:text-generation",
"size_categories:n>1T",
"language:zh",
"region:us"
] |
2023-09-19T00:35:45+00:00
|
{"language": ["zh"], "size_categories": ["n>1T"], "task_categories": ["text-generation"]}
|
2023-11-02T09:06:54+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-text-generation #size_categories-n>1T #language-Chinese #region-us
|
# WuDao (悟道) Dataset
Not the original creator; this is only a mirror.
The download is about 60 GB, expanding to about 220 GB after decompression.
### Original Link
Science Data Bank
## Usage
## Clearing Data
If a download fails, clear the cached data manually
## Category Statistics
## Cite
|
[
"# 悟道(WuDao)資料集\n非原製作者,僅搬移。\n\n此資料集下載約60GB,解壓縮後約220GB。",
"### 原始連結\nScience Data Bank",
"## 使用",
"## 清除資料\n當下載失敗的時候請手動清除資料",
"## 資料類別統計",
"## Cite"
] |
[
"TAGS\n#task_categories-text-generation #size_categories-n>1T #language-Chinese #region-us \n",
"# 悟道(WuDao)資料集\n非原製作者,僅搬移。\n\n此資料集下載約60GB,解壓縮後約220GB。",
"### 原始連結\nScience Data Bank",
"## 使用",
"## 清除資料\n當下載失敗的時候請手動清除資料",
"## 資料類別統計",
"## Cite"
] |
[
32,
37,
8,
3,
14,
5,
3
] |
[
"passage: TAGS\n#task_categories-text-generation #size_categories-n>1T #language-Chinese #region-us \n# 悟道(WuDao)資料集\n非原製作者,僅搬移。\n\n此資料集下載約60GB,解壓縮後約220GB。### 原始連結\nScience Data Bank## 使用## 清除資料\n當下載失敗的時候請手動清除資料## 資料類別統計## Cite"
] |
68f13ded953e3bd69168c07b68b7e78d87e57b86
|
# MATH dataset
Original version: https://huggingface.co/datasets/lighteval/MATH
Translation source code: https://github.com/martinakaduc/ura-llama/tree/main/dataset_scripts/custom_datasets
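A minimal loading sketch; `gcp` and `azr` are the two translation configurations declared in this repo's metadata, each with `train` and `test` splits:

```python
from datasets import load_dataset

# Load the GCP-translated configuration of the Vietnamese MATH dataset.
train = load_dataset("ura-hcmut/MATH", "gcp", split="train")
test = load_dataset("ura-hcmut/MATH", "gcp", split="test")
```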
|
ura-hcmut/MATH
|
[
"task_categories:text2text-generation",
"language:vi",
"license:cc-by-nc-sa-4.0",
"region:us"
] |
2023-09-19T00:55:00+00:00
|
{"language": ["vi"], "license": "cc-by-nc-sa-4.0", "task_categories": ["text2text-generation"], "configs": [{"config_name": "gcp", "data_files": [{"split": "train", "path": "MATH_gcp_training.csv"}, {"split": "test", "path": "MATH_gcp.csv"}]}, {"config_name": "azr", "data_files": [{"split": "train", "path": "MATH_azr_training.csv"}, {"split": "test", "path": "MATH_azr.csv"}]}]}
|
2024-01-29T16:35:34+00:00
|
[] |
[
"vi"
] |
TAGS
#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us
|
# MATH dataset
Original version: URL
Translation source code: URL
|
[
"# MATH dataset\n\nOriginal version: URL\n\nTranslation source code: URL"
] |
[
"TAGS\n#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us \n",
"# MATH dataset\n\nOriginal version: URL\n\nTranslation source code: URL"
] |
[
39,
14
] |
[
"passage: TAGS\n#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us \n# MATH dataset\n\nOriginal version: URL\n\nTranslation source code: URL"
] |
b8993afcbe4ca1fca25c88c50644c01c0d6d6bca
|
# Synthetic reasoning dataset
Original version:
- https://huggingface.co/datasets/lighteval/synthetic_reasoning
Translation source code: https://github.com/martinakaduc/ura-llama/tree/main/dataset_scripts/custom_datasets
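A minimal loading sketch; config names combine the task (`induction`, `pattern_match`, `variable_substitution`) with the translation source (`gcp` or `azr`), as declared in this repo's metadata:

```python
from datasets import load_dataset

# Load the GCP-translated induction task.
ds = load_dataset("ura-hcmut/synthetic_reasoning", "induction_gcp", split="test")
```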
|
ura-hcmut/synthetic_reasoning
|
[
"task_categories:text2text-generation",
"language:vi",
"license:cc-by-nc-sa-4.0",
"region:us"
] |
2023-09-19T01:01:51+00:00
|
{"language": ["vi"], "license": "cc-by-nc-sa-4.0", "task_categories": ["text2text-generation"], "configs": [{"config_name": "induction_gcp", "data_files": [{"split": "train", "path": "synthetic_reasoning_gcp_induction_training.csv"}, {"split": "test", "path": "synthetic_reasoning_gcp_induction.csv"}]}, {"config_name": "induction_azr", "data_files": [{"split": "train", "path": "synthetic_reasoning_azr_induction_training.csv"}, {"split": "test", "path": "synthetic_reasoning_azr_induction.csv"}]}, {"config_name": "pattern_match_gcp", "data_files": [{"split": "train", "path": "synthetic_reasoning_gcp_pattern_match_training.csv"}, {"split": "test", "path": "synthetic_reasoning_gcp_pattern_match.csv"}]}, {"config_name": "pattern_match_azr", "data_files": [{"split": "train", "path": "synthetic_reasoning_azr_pattern_match_training.csv"}, {"split": "test", "path": "synthetic_reasoning_azr_pattern_match.csv"}]}, {"config_name": "variable_substitution_gcp", "data_files": [{"split": "train", "path": "synthetic_reasoning_gcp_variable_substitution_training.csv"}, {"split": "test", "path": "synthetic_reasoning_gcp_variable_substitution.csv"}]}, {"config_name": "variable_substitution_azr", "data_files": [{"split": "train", "path": "synthetic_reasoning_azr_variable_substitution_training.csv"}, {"split": "test", "path": "synthetic_reasoning_azr_variable_substitution.csv"}]}]}
|
2023-09-19T01:37:10+00:00
|
[] |
[
"vi"
] |
TAGS
#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us
|
# Synthetic reasoning dataset
Original version:
- URL
Translation source code: URL
|
[
"# Synthetic reasoning dataset\n\nOriginal version:\n- URL\n\nTranslation source code: URL"
] |
[
"TAGS\n#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us \n",
"# Synthetic reasoning dataset\n\nOriginal version:\n- URL\n\nTranslation source code: URL"
] |
[
39,
18
] |
[
"passage: TAGS\n#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us \n# Synthetic reasoning dataset\n\nOriginal version:\n- URL\n\nTranslation source code: URL"
] |
44e21d0d583347e2473bb1562af9763ea5c8d656
|
# Dataset Card for "donut-dummy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
arifzanko/donut-dummy
|
[
"region:us"
] |
2023-09-19T01:08:39+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 103705.0, "num_examples": 2}, {"name": "validation", "num_bytes": 46768.0, "num_examples": 1}, {"name": "test", "num_bytes": 48489.0, "num_examples": 1}], "download_size": 109961, "dataset_size": 198962.0}}
|
2023-09-19T01:38:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "donut-dummy"
More Information needed
|
[
"# Dataset Card for \"donut-dummy\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"donut-dummy\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"donut-dummy\"\n\nMore Information needed"
] |
a9a8284468be36a4a3aaaa1fa76be989d0164deb
|
# Dataset Card for "data_test_whisper_large_v2_legit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
linhqyy/data_test_whisper_large_v2_legit
|
[
"region:us"
] |
2023-09-19T01:16:51+00:00
|
{"dataset_info": {"features": [{"name": "audio", "dtype": {"audio": {"sampling_rate": 16000}}}, {"name": "id", "dtype": "string"}, {"name": "pred_str", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 174278487.625, "num_examples": 1299}], "download_size": 164191689, "dataset_size": 174278487.625}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-19T01:17:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "data_test_whisper_large_v2_legit"
More Information needed
|
[
"# Dataset Card for \"data_test_whisper_large_v2_legit\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"data_test_whisper_large_v2_legit\"\n\nMore Information needed"
] |
[
6,
26
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"data_test_whisper_large_v2_legit\"\n\nMore Information needed"
] |
f8139a1cc525ef6a0c946d2a9c407d402bfcfc19
|
# Dataset Card for "squad_id_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
tyzhu/squad_id_train_10_eval_10
|
[
"region:us"
] |
2023-09-19T01:18:50+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 237881, "num_examples": 150}, {"name": "validation", "num_bytes": 59860, "num_examples": 48}], "download_size": 72567, "dataset_size": 297741}}
|
2023-09-19T01:18:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "squad_id_train_10_eval_10"
More Information needed
|
[
"# Dataset Card for \"squad_id_train_10_eval_10\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_id_train_10_eval_10\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_id_train_10_eval_10\"\n\nMore Information needed"
] |
4e0323886583de62a35d357e64e5d8fc4916b3b4
|
# Dataset Card for "4b9958b5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
result-kand2-sdxl-wuerst-karlo/4b9958b5
|
[
"region:us"
] |
2023-09-19T01:29:31+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 167, "num_examples": 10}], "download_size": 1331, "dataset_size": 167}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-19T01:29:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "4b9958b5"
More Information needed
|
[
"# Dataset Card for \"4b9958b5\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"4b9958b5\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"4b9958b5\"\n\nMore Information needed"
] |
cd10c69b44e06c57750189e61fe39c33fd9fd546
|
# Synthetic reasoning dataset
Original version:
- https://huggingface.co/datasets/lighteval/synthetic_reasoning_natural
Translation source code: https://github.com/martinakaduc/ura-llama/tree/main/dataset_scripts/custom_datasets
|
ura-hcmut/synthetic_reasoning_natural
|
[
"task_categories:text2text-generation",
"language:vi",
"license:cc-by-nc-sa-4.0",
"region:us"
] |
2023-09-19T01:34:10+00:00
|
{"language": ["vi"], "license": "cc-by-nc-sa-4.0", "task_categories": ["text2text-generation"], "configs": [{"config_name": "easy_gcp", "data_files": [{"split": "train", "path": "synthetic_reasoning_gcp_natural_training.csv"}, {"split": "test", "path": "synthetic_reasoning_gcp_natural.csv"}]}, {"config_name": "easy_azr", "data_files": [{"split": "train", "path": "synthetic_reasoning_azr_natural_training.csv"}, {"split": "test", "path": "synthetic_reasoning_azr_natural.csv"}]}]}
|
2023-09-19T01:35:59+00:00
|
[] |
[
"vi"
] |
TAGS
#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us
|
# Synthetic reasoning dataset
Original version:
- URL
Translation source code: URL
|
[
"# Synthetic reasoning dataset\n\nOriginal version:\n- URL\n\nTranslation source code: URL"
] |
[
"TAGS\n#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us \n",
"# Synthetic reasoning dataset\n\nOriginal version:\n- URL\n\nTranslation source code: URL"
] |
[
39,
18
] |
[
"passage: TAGS\n#task_categories-text2text-generation #language-Vietnamese #license-cc-by-nc-sa-4.0 #region-us \n# Synthetic reasoning dataset\n\nOriginal version:\n- URL\n\nTranslation source code: URL"
] |
1389cf580b113140e8d6bbf246ffa2d9219e0e2c
|
# Dataset Card for "c4_counterfactual_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
zxvix/c4_counterfactual_3
|
[
"region:us"
] |
2023-09-19T01:39:04+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}, {"name": "timestamp", "dtype": "timestamp[s]"}, {"name": "url", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 3490614.435, "num_examples": 985}], "download_size": 2246810, "dataset_size": 3490614.435}}
|
2023-09-19T02:38:25+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "c4_counterfactual_3"
More Information needed
|
[
"# Dataset Card for \"c4_counterfactual_3\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"c4_counterfactual_3\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"c4_counterfactual_3\"\n\nMore Information needed"
] |
77beb3a8c90342f9b8b1196576f6fa6821b92101
|
# Dataset Card for "clean_notebooks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
vikp/clean_notebooks
|
[
"region:us"
] |
2023-09-19T02:10:59+00:00
|
{"dataset_info": {"features": [{"name": "code", "dtype": "string"}, {"name": "kind", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9206821200.476824, "num_examples": 1011857}], "download_size": 5400580201, "dataset_size": 9206821200.476824}}
|
2023-09-19T03:13:51+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "clean_notebooks"
More Information needed
|
[
"# Dataset Card for \"clean_notebooks\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"clean_notebooks\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"clean_notebooks\"\n\nMore Information needed"
] |
5c91a545065550c78451666352913277b6d4ac29
|
# Dataset Card for "TextNormSplit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
DataStudio/TextNormSplitting
|
[
"region:us"
] |
2023-09-19T02:15:36+00:00
|
{"dataset_info": {"features": [{"name": "speak", "dtype": "string"}, {"name": "write", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2887027702, "num_examples": 848161}], "download_size": 1453821474, "dataset_size": 2887027702}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-20T03:34:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "TextNormSplit"
More Information needed
|
[
"# Dataset Card for \"TextNormSplit\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"TextNormSplit\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"TextNormSplit\"\n\nMore Information needed"
] |
69813243bac35d0050a736b3aa25a92bec30a8a6
|
# Domain Adaptation of Large Language Models
This repo contains the **evaluation datasets** for our **ICLR 2024** paper [Adapting Large Language Models via Reading Comprehension](https://huggingface.co/papers/2309.09530).
We explore **continued pre-training on domain-specific corpora** for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to **transform large-scale pre-training corpora into reading comprehension texts**, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. **Our 7B model competes with much larger domain-specific models like BloombergGPT-50B**.
### 🤗 We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! 🤗
**************************** **Updates** ****************************
* 2024/1/16: 🎉 Our [research paper](https://huggingface.co/papers/2309.09530) has been accepted by ICLR 2024!!!🎉
* 2023/12/19: Released our [13B base models](https://huggingface.co/AdaptLLM/law-LLM-13B) developed from LLaMA-1-13B.
* 2023/12/8: Released our [chat models](https://huggingface.co/AdaptLLM/law-chat) developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our [paper](https://huggingface.co/papers/2309.09530), [code](https://github.com/microsoft/LMOps), [data](https://huggingface.co/datasets/AdaptLLM/law-tasks), and [base models](https://huggingface.co/AdaptLLM/law-LLM) developed from LLaMA-1-7B.
## Domain-Specific LLaMA-1
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available on Hugging Face: [Biomedicine-LLM](https://huggingface.co/AdaptLLM/medicine-LLM), [Finance-LLM](https://huggingface.co/AdaptLLM/finance-LLM) and [Law-LLM](https://huggingface.co/AdaptLLM/law-LLM). The performance of our AdaptLLM models compared to other domain-specific LLMs is shown below:
<p align='center'>
<img src="https://cdn-uploads.huggingface.co/production/uploads/650801ced5578ef7e20b33d4/6efPwitFgy-pLTzvccdcP.png" width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if **our method is similarly effective for larger-scale models**, and the results are consistently positive too: [Biomedicine-LLM-13B](https://huggingface.co/AdaptLLM/medicine-LLM-13B), [Finance-LLM-13B](https://huggingface.co/AdaptLLM/finance-LLM-13B) and [Law-LLM-13B](https://huggingface.co/AdaptLLM/law-LLM-13B).
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a [specific data format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2), and our **reading comprehension can perfectly fit the data format** by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: [Biomedicine-Chat](https://huggingface.co/AdaptLLM/medicine-chat), [Finance-Chat](https://huggingface.co/AdaptLLM/finance-chat) and [Law-Chat](https://huggingface.co/AdaptLLM/law-chat).
For example, to chat with the finance-chat model:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
model = AutoModelForCausalLM.from_pretrained("AdaptLLM/finance-chat")
tokenizer = AutoTokenizer.from_pretrained("AdaptLLM/finance-chat")
# Put your input here:
user_input = '''Use this fact to answer the question: Title of each class Trading Symbol(s) Name of each exchange on which registered
Common Stock, Par Value $.01 Per Share MMM New York Stock Exchange
MMM Chicago Stock Exchange, Inc.
1.500% Notes due 2026 MMM26 New York Stock Exchange
1.750% Notes due 2030 MMM30 New York Stock Exchange
1.500% Notes due 2031 MMM31 New York Stock Exchange
Which debt securities are registered to trade on a national securities exchange under 3M's name as of Q2 of 2023?'''
# Apply the prompt template and system prompt of LLaMA-2-Chat demo for chat models (NOTE: NO prompt template is required for base models!)
our_system_prompt = "\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\n\nIf a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.\n" # Please do NOT change this
prompt = f"<s>[INST] <<SYS>>{our_system_prompt}<</SYS>>\n\n{user_input} [/INST]"
# # NOTE:
# # If you want to apply your own system prompt, please integrate it into the instruction part following our system prompt like this:
# your_system_prompt = "Please, check if the answer can be inferred from the pieces of context provided."
# prompt = f"<s>[INST] <<SYS>>{our_system_prompt}<</SYS>>\n\n{your_system_prompt}\n{user_input} [/INST]"
inputs = tokenizer(prompt, return_tensors="pt", add_special_tokens=False).input_ids.to(model.device)
outputs = model.generate(input_ids=inputs, max_length=4096)[0]
answer_start = int(inputs.shape[-1])
pred = tokenizer.decode(outputs[answer_start:], skip_special_tokens=True)
print(f'### User Input:\n{user_input}\n\n### Assistant Output:\n{pred}')
```
## Domain-Specific Tasks
To easily reproduce our results, we have uploaded the filled-in zero/few-shot input instructions and output completions of each domain-specific task: [biomedicine-tasks](https://huggingface.co/datasets/AdaptLLM/medicine-tasks), [finance-tasks](https://huggingface.co/datasets/AdaptLLM/finance-tasks), and [law-tasks](https://huggingface.co/datasets/AdaptLLM/law-tasks).
**Note:** those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
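A minimal loading sketch; `ConvFinQA`, `FiQA_SA`, `FPB`, `Headline`, and `NER` are the task configurations declared in this repo's metadata, each exposing a `test` split:

```python
from datasets import load_dataset

# Load the filled-in zero/few-shot instructions for the FiQA_SA task.
fiqa_sa = load_dataset("AdaptLLM/finance-tasks", "FiQA_SA", split="test")
```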
## Citation
If you find our work helpful, please cite us:
```bibtex
@article{adaptllm,
title={Adapting large language models via reading comprehension},
author={Cheng, Daixuan and Huang, Shaohan and Wei, Furu},
journal={arXiv preprint arXiv:2309.09530},
year={2023}
}
```
|
AdaptLLM/finance-tasks
|
[
"task_categories:text-classification",
"task_categories:question-answering",
"task_categories:zero-shot-classification",
"task_categories:conversational",
"language:en",
"finance",
"arxiv:2309.09530",
"region:us"
] |
2023-09-19T02:17:07+00:00
|
{"language": ["en"], "task_categories": ["text-classification", "question-answering", "zero-shot-classification", "conversational"], "configs": [{"config_name": "ConvFinQA", "data_files": [{"split": "test", "path": "ConviFinQA/test.json"}]}, {"config_name": "FiQA_SA", "data_files": [{"split": "test", "path": "FiQA_SA/test.json"}]}, {"config_name": "FPB", "data_files": [{"split": "test", "path": "FPB/test.json"}]}, {"config_name": "Headline", "data_files": [{"split": "test", "path": "Headline/test.json"}]}, {"config_name": "NER", "data_files": [{"split": "test", "path": "NER/test.json"}]}], "tags": ["finance"]}
|
2024-02-07T12:32:11+00:00
|
[
"2309.09530"
] |
[
"en"
] |
TAGS
#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-conversational #language-English #finance #arxiv-2309.09530 #region-us
|
# Domain Adaptation of Large Language Models
This repo contains the evaluation datasets for our ICLR 2024 paper Adapting Large Language Models via Reading Comprehension.
We explore continued pre-training on domain-specific corpora for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to transform large-scale pre-training corpora into reading comprehension texts, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. Our 7B model competes with much larger domain-specific models like BloombergGPT-50B.
### We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned!
Updates
* 2024/1/16: Our research paper has been accepted by ICLR 2024!!!
* 2023/12/19: Released our 13B base models developed from LLaMA-1-13B.
* 2023/12/8: Released our chat models developed from LLaMA-2-Chat-7B.
* 2023/9/18: Released our paper, code, data, and base models developed from LLaMA-1-7B.
## Domain-Specific LLaMA-1
### LLaMA-1-7B
In our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available on Hugging Face: Biomedicine-LLM, Finance-LLM and Law-LLM. The performance of our AdaptLLM models compared to other domain-specific LLMs is shown below:
<p align='center'>
<img src="URL width="700">
</p>
### LLaMA-1-13B
Moreover, we scale up our base model to LLaMA-1-13B to see if our method is similarly effective for larger-scale models, and the results are consistently positive too: Biomedicine-LLM-13B, Finance-LLM-13B and Law-LLM-13B.
## Domain-Specific LLaMA-2-Chat
Our method is also effective for aligned models! LLaMA-2-Chat requires a specific data format, and our reading comprehension can perfectly fit the data format by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: Biomedicine-Chat, Finance-Chat and Law-Chat.
For example, to chat with the finance-chat model:
## Domain-Specific Tasks
To easily reproduce our results, we have uploaded the filled-in zero/few-shot input instructions and output completions of each domain-specific task: biomedicine-tasks, finance-tasks, and law-tasks.
Note: those filled-in instructions are specifically tailored for models before alignment and do NOT fit the specific data format required for chat models.
If you find our work helpful, please cite us:
|
[
"# Domain Adaptation of Large Language Models\nThis repo contains the evaluation datasets for our ICLR 2024 paper Adapting Large Language Models via Reading Comprehension.\n\nWe explore continued pre-training on domain-specific corpora for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to transform large-scale pre-training corpora into reading comprehension texts, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. Our 7B model competes with much larger domain-specific models like BloombergGPT-50B.",
"### We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! \n\n Updates \n* 2024/1/16: Our research paper has been accepted by ICLR 2024!!!\n* 2023/12/19: Released our 13B base models developed from LLaMA-1-13B.\n* 2023/12/8: Released our chat models developed from LLaMA-2-Chat-7B.\n* 2023/9/18: Released our paper, code, data, and base models developed from LLaMA-1-7B.",
"## Domain-Specific LLaMA-1",
"### LLaMA-1-7B\nIn our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available in Huggingface: Biomedicine-LLM, Finance-LLM and Law-LLM, the performances of our AdaptLLM compared to other domain-specific LLMs are:\n\n<p align='center'>\n <img src=\"URL width=\"700\">\n</p>",
"### LLaMA-1-13B\nMoreover, we scale up our base model to LLaMA-1-13B to see if our method is similarly effective for larger-scale models, and the results are consistently positive too: Biomedicine-LLM-13B, Finance-LLM-13B and Law-LLM-13B.",
"## Domain-Specific LLaMA-2-Chat\nOur method is also effective for aligned models! LLaMA-2-Chat requires a specific data format, and our reading comprehension can perfectly fit the data format by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: Biomedicine-Chat, Finance-Chat and Law-Chat\n\nFor example, to chat with the finance-chat model:",
"## Domain-Specific Tasks\nTo easily reproduce our results, we have uploaded the filled-in zero/few-shot input instructions and output completions of each domain-specific task: biomedicine-tasks, finance-tasks, and law-tasks.\n\nNote: those filled-in instructions are specifically tailored for models before alignment and do NOT fit for the specific data format required for chat models.\n\nIf you find our work helpful, please cite us:"
] |
[
"TAGS\n#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-conversational #language-English #finance #arxiv-2309.09530 #region-us \n",
"# Domain Adaptation of Large Language Models\nThis repo contains the evaluation datasets for our ICLR 2024 paper Adapting Large Language Models via Reading Comprehension.\n\nWe explore continued pre-training on domain-specific corpora for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to transform large-scale pre-training corpora into reading comprehension texts, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. Our 7B model competes with much larger domain-specific models like BloombergGPT-50B.",
"### We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! \n\n Updates \n* 2024/1/16: Our research paper has been accepted by ICLR 2024!!!\n* 2023/12/19: Released our 13B base models developed from LLaMA-1-13B.\n* 2023/12/8: Released our chat models developed from LLaMA-2-Chat-7B.\n* 2023/9/18: Released our paper, code, data, and base models developed from LLaMA-1-7B.",
"## Domain-Specific LLaMA-1",
"### LLaMA-1-7B\nIn our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available in Huggingface: Biomedicine-LLM, Finance-LLM and Law-LLM, the performances of our AdaptLLM compared to other domain-specific LLMs are:\n\n<p align='center'>\n <img src=\"URL width=\"700\">\n</p>",
"### LLaMA-1-13B\nMoreover, we scale up our base model to LLaMA-1-13B to see if our method is similarly effective for larger-scale models, and the results are consistently positive too: Biomedicine-LLM-13B, Finance-LLM-13B and Law-LLM-13B.",
"## Domain-Specific LLaMA-2-Chat\nOur method is also effective for aligned models! LLaMA-2-Chat requires a specific data format, and our reading comprehension can perfectly fit the data format by transforming the reading comprehension into a multi-turn conversation. We have also open-sourced chat models in different domains: Biomedicine-Chat, Finance-Chat and Law-Chat\n\nFor example, to chat with the finance-chat model:",
"## Domain-Specific Tasks\nTo easily reproduce our results, we have uploaded the filled-in zero/few-shot input instructions and output completions of each domain-specific task: biomedicine-tasks, finance-tasks, and law-tasks.\n\nNote: those filled-in instructions are specifically tailored for models before alignment and do NOT fit for the specific data format required for chat models.\n\nIf you find our work helpful, please cite us:"
] |
[
68,
153,
116,
10,
97,
72,
101,
103
] |
[
"passage: TAGS\n#task_categories-text-classification #task_categories-question-answering #task_categories-zero-shot-classification #task_categories-conversational #language-English #finance #arxiv-2309.09530 #region-us \n# Domain Adaptation of Large Language Models\nThis repo contains the evaluation datasets for our ICLR 2024 paper Adapting Large Language Models via Reading Comprehension.\n\nWe explore continued pre-training on domain-specific corpora for large language models. While this approach enriches LLMs with domain knowledge, it significantly hurts their prompting ability for question answering. Inspired by human learning via reading comprehension, we propose a simple method to transform large-scale pre-training corpora into reading comprehension texts, consistently improving prompting performance across tasks in biomedicine, finance, and law domains. Our 7B model competes with much larger domain-specific models like BloombergGPT-50B.### We are currently working hard on developing models across different domains, scales and architectures! Please stay tuned! \n\n Updates \n* 2024/1/16: Our research paper has been accepted by ICLR 2024!!!\n* 2023/12/19: Released our 13B base models developed from LLaMA-1-13B.\n* 2023/12/8: Released our chat models developed from LLaMA-2-Chat-7B.\n* 2023/9/18: Released our paper, code, data, and base models developed from LLaMA-1-7B.## Domain-Specific LLaMA-1### LLaMA-1-7B\nIn our paper, we develop three domain-specific models from LLaMA-1-7B, which are also available in Huggingface: Biomedicine-LLM, Finance-LLM and Law-LLM, the performances of our AdaptLLM compared to other domain-specific LLMs are:\n\n<p align='center'>\n <img src=\"URL width=\"700\">\n</p>"
] |
437c50666fbfb0d8065ca93137843d184576a1f1
|
# Dataset Card for "alpaca-2-13B-chinese-couplet-val-4k-predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
chenqile09/alpaca-2-13B-chinese-couplet-val-4k-predictions
|
[
"region:us"
] |
2023-09-19T02:25:33+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "data", "path": "data/data-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "prediction", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "data", "num_bytes": 386019, "num_examples": 4000}], "download_size": 341079, "dataset_size": 386019}}
|
2023-09-19T02:25:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "alpaca-2-13B-chinese-couplet-val-4k-predictions"
More Information needed
|
[
"# Dataset Card for \"alpaca-2-13B-chinese-couplet-val-4k-predictions\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"alpaca-2-13B-chinese-couplet-val-4k-predictions\"\n\nMore Information needed"
] |
[
6,
29
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"alpaca-2-13B-chinese-couplet-val-4k-predictions\"\n\nMore Information needed"
] |
22335a83294726bcb671acbcfb6ed5faabc69c38
|
# Dataset Card for "alpaca-2-7B-chinese-couplet-val-4k-predictions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
chenqile09/alpaca-2-7B-chinese-couplet-val-4k-predictions
|
[
"region:us"
] |
2023-09-19T02:45:30+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "data", "path": "data/data-*"}]}], "dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "prediction", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "data", "num_bytes": 386155, "num_examples": 4000}], "download_size": 342185, "dataset_size": 386155}}
|
2023-09-19T02:45:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "alpaca-2-7B-chinese-couplet-val-4k-predictions"
More Information needed
|
[
"# Dataset Card for \"alpaca-2-7B-chinese-couplet-val-4k-predictions\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"alpaca-2-7B-chinese-couplet-val-4k-predictions\"\n\nMore Information needed"
] |
[
6,
29
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"alpaca-2-7B-chinese-couplet-val-4k-predictions\"\n\nMore Information needed"
] |
f14dfe34c9771abd6118e9ff4becfa77e110cd80
|
This dataset provides the PubMed summarization data in the Stanford Alpaca format, intended for lightweight fine-tuning of the Llama2 large language model. You can click [here](https://www.runoob.com) to view.
To cite the original work:
```
@inproceedings{cohan-etal-2018-discourse,
title = "A Discourse-Aware Attention Model for Abstractive Summarization of Long Documents",
author = "Cohan, Arman and
Dernoncourt, Franck and
Kim, Doo Soon and
Bui, Trung and
Kim, Seokhwan and
Chang, Walter and
Goharian, Nazli",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2097",
doi = "10.18653/v1/N18-2097",
pages = "615--621",
abstract = "Neural abstractive summarization models have led to promising results in summarizing relatively short documents. We propose the first model for abstractive summarization of single, longer-form documents (e.g., research papers). Our approach consists of a new hierarchical encoder that models the discourse structure of a document, and an attentive discourse-aware decoder to generate the summary. Empirical results on two large-scale datasets of scientific papers show that our model significantly outperforms state-of-the-art models.",
}
```
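For context, Stanford Alpaca records follow an instruction/input/output layout. Below is a hedged illustration of what a summarization record in that format typically looks like; the field contents are hypothetical and not drawn from this dataset:

```python
# Hypothetical Alpaca-format record; the actual instructions and
# texts in this dataset may differ.
example = {
    "instruction": "Summarize the following biomedical research article.",
    "input": "BACKGROUND: ... (full article text) ...",
    "output": "... (abstract-style summary) ...",
}
```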
|
ZhongshengWang/Alpaca-pubmed-summarization
|
[
"task_categories:summarization",
"task_categories:text-generation",
"multilinguality:monolingual",
"size_categories:100K<n<1M",
"language:en",
"license:openrail",
"conditional-text-generation",
"region:us"
] |
2023-09-19T03:43:13+00:00
|
{"language": ["en"], "license": "openrail", "multilinguality": ["monolingual"], "size_categories": ["100K<n<1M"], "task_categories": ["summarization", "text-generation"], "tags": ["conditional-text-generation"]}
|
2023-09-19T04:47:25+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-summarization #task_categories-text-generation #multilinguality-monolingual #size_categories-100K<n<1M #language-English #license-openrail #conditional-text-generation #region-us
|
This dataset provides the PubMed summarization data in the Stanford Alpaca format, intended for lightweight fine-tuning of the Llama2 large language model. You can click here to view.
To cite the original work:
|
[] |
[
"TAGS\n#task_categories-summarization #task_categories-text-generation #multilinguality-monolingual #size_categories-100K<n<1M #language-English #license-openrail #conditional-text-generation #region-us \n"
] |
[
65
] |
[
"passage: TAGS\n#task_categories-summarization #task_categories-text-generation #multilinguality-monolingual #size_categories-100K<n<1M #language-English #license-openrail #conditional-text-generation #region-us \n"
] |
29520ebc8c6e6ed9798b47a40f394c717e7ee05b
|
# Dataset Card for HAR
A tabular dataset which poses the task of predicting human activity based on smartphone sensor signals (accelerometer and gyroscope).
## Dataset Details
### Dataset Description
*Summary from https://archive.ics.uci.edu/dataset/240/human+activity+recognition+using+smartphones:*
The experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. They performed a protocol of activities composed of six basic activities: three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs and walking upstairs). The experiment also included postural transitions that occurred between the static postures. These are: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. All the participants were wearing a smartphone (Samsung Galaxy S II) on the waist during the experiment execution. We captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz using the embedded accelerometer and gyroscope of the device. The experiments were video-recorded to label the data manually. The obtained dataset was randomly partitioned into two sets, where 70% of the volunteers were selected for generating the training data and 30% the test data.
The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of 561 features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details.
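The filtering and windowing described above can be reproduced with standard signal-processing tools. Below is a minimal sketch under the stated parameters (50 Hz sampling, 0.3 Hz low-pass cutoff, 2.56 s windows with 50% overlap); the Butterworth filter order is an assumption, since the card does not state it.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def separate_gravity(acc, fs=50.0, cutoff=0.3, order=3):
    """Split raw 3-axial acceleration into gravity and body components
    with a Butterworth low-pass filter (filter order is an assumption)."""
    sos = butter(order, cutoff, btype="lowpass", fs=fs, output="sos")
    gravity = sosfiltfilt(sos, acc, axis=0)  # low-frequency component
    body = acc - gravity                     # residual body motion
    return gravity, body

def sliding_windows(signal, fs=50.0, win_sec=2.56, overlap=0.5):
    """Fixed-width sliding windows: 128 readings per window at 50 Hz,
    advancing by a 50% (64-sample) hop."""
    win = int(win_sec * fs)
    step = int(win * (1 - overlap))
    return np.stack([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])
```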
This dataset is an updated version of the UCI Human Activity Recognition Using smartphones Dataset that can be found at: https://archive.ics.uci.edu/ml/datasets/Human+Activity+Recognition+Using+Smartphones
This version provides the original raw inertial signals from the smartphone sensors, instead of the ones pre-processed into windows which were provided in version 1. This change was done in order to be able to make online tests with the raw data. Moreover, the activity labels were updated in order to include postural transitions that were not part of the previous version of the dataset.
- **Curated by:** Reyes-Ortiz, Jorge, Anguita, Davide, Ghio, Alessandro, Oneto, Luca, and Parra, Xavier
- **License:** This dataset is licensed under a [Creative Commons Attribution 4.0 International (CC BY 4.0)](https://creativecommons.org/licenses/by/4.0/legalcode) license.
### Dataset Sources
- **Repository:** http://archive.ics.uci.edu/dataset/341/smartphone+based+recognition+of+human+activities+and+postural+transitions
- **Paper:** https://www.sciencedirect.com/science/article/abs/pii/S0925231215010930
- **Experiment Demo:** http://www.youtube.com/watch?v=XOEN9W05_4A
## Citation
**BibTeX:**
```
@misc{misc_smartphone-based_recognition_of_human_activities_and_postural_transitions_341,
  author       = {Reyes-Ortiz, Jorge and Anguita, Davide and Oneto, Luca and Parra, Xavier},
  title        = {{Smartphone-Based Recognition of Human Activities and Postural Transitions}},
  year         = {2015},
  howpublished = {UCI Machine Learning Repository},
  note         = {{DOI}: https://doi.org/10.24432/C54G7M}
}
```
**APA:**
Reyes-Ortiz, Jorge, Anguita, Davide, Oneto, Luca, and Parra, Xavier. (2015). Smartphone-Based Recognition of Human Activities and Postural Transitions. UCI Machine Learning Repository. https://doi.org/10.24432/C54G7M.
|
codymlewis/HAR
|
[
"size_categories:n<1K",
"license:cc-by-4.0",
"region:us"
] |
2023-09-19T04:19:13+00:00
|
{"license": "cc-by-4.0", "size_categories": ["n<1K"], "pretty_name": "HAR", "dataset_info": {"features": [{"name": "features", "sequence": "float32", "length": 561}, {"name": "labels", "dtype": {"class_label": {"names": {"0": "WALKING", "1": "WALKING_UPSTAIRS", "2": "WALKING_DOWNSTAIRS", "3": "SITTING", "4": "STANDING", "5": "LAYING", "6": "STAND_TO_SIT", "7": "SIT_TO_STAND", "8": "SIT_TO_LIE", "9": "LIE_TO_SIT", "10": "STAND_TO_LIE", "11": "LIE_TO_STAND"}}}}, {"name": "subject id", "dtype": "uint8"}], "splits": [{"name": "train", "num_bytes": 17499051, "num_examples": 7767}, {"name": "test", "num_bytes": 7123986, "num_examples": 3162}], "download_size": 79596192, "dataset_size": 24623037}}
|
2023-10-13T02:23:34+00:00
|
[] |
[] |
TAGS
#size_categories-n<1K #license-cc-by-4.0 #region-us
|
# Dataset Card for HAR
A tabular dataset which poses the task of predicting human activity based on smartphone sensor signals (accelerometer and gyroscope).
## Dataset Details
### Dataset Description
*Summary from URL
The experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. They performed a protocol of activities composed of six basic activities: three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs and walking upstairs). The experiment also included postural transitions that occurred between the static postures. These are: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. All the participants were wearing a smartphone (Samsung Galaxy S II) on the waist during the experiment execution. We captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz using the embedded accelerometer and gyroscope of the device. The experiments were video-recorded to label the data manually. The obtained dataset was randomly partitioned into two sets, where 70% of the volunteers were selected for generating the training data and 30% the test data.
The sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of 561 features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details.
This dataset is an updated version of the UCI Human Activity Recognition Using smartphones Dataset that can be found at: URL
This version provides the original raw inertial signals from the smartphone sensors, instead of the ones pre-processed into windows which were provided in version 1. This change was done in order to be able to make online tests with the raw data. Moreover, the activity labels were updated in order to include postural transitions that were not part of the previous version of the dataset.
- Curated by: Reyes-Ortiz, Jorge, Anguita, Davide, Ghio, Alessandro, Oneto, Luca, and Parra, Xavier
- License: This dataset is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) license.
### Dataset Sources
- Repository: URL
- Paper: URL
- Experiment Demo: URL
BibTeX:
@misc{misc_smartphone-based_recognition_of_human_activities_and_postural_transitions_341,
  author = {Reyes-Ortiz, Jorge and Anguita, Davide and Oneto, Luca and Parra, Xavier},
  title = {{Smartphone-Based Recognition of Human Activities and Postural Transitions}},
  year = {2015},
  howpublished = {UCI Machine Learning Repository},
  note = {{DOI}: URL}
}
APA:
Reyes-Ortiz, Jorge, Anguita, Davide, Oneto, Luca, and Parra, Xavier. (2015). Smartphone-Based Recognition of Human Activities and Postural Transitions. UCI Machine Learning Repository. URL
|
[
"# Dataset Card for HAR\n\nA tabular dataset which poses the task of prediction human activity based on smartphone sensor signal (accelerometer and gyroscope).",
"## Dataset Details",
"### Dataset Description\n\n*Summary from URL\nThe experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. They performed a protocol of activities composed of six basic activities: three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs and walking upstairs). The experiment also included postural transitions that occurred between the static postures. These are: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. All the participants were wearing a smartphone (Samsung Galaxy S II) on the waist during the experiment execution. We captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz using the embedded accelerometer and gyroscope of the device. The experiments were video-recorded to label the data manually. The obtained dataset was randomly partitioned into two sets, where 70% of the volunteers was selected for generating the training data and 30% the test data. \n\nThe sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of 561 features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details. \n\nThis dataset is an updated version of the UCI Human Activity Recognition Using smartphones Dataset that can be found at: URL\n\nThis version provides the original raw inertial signals from the smartphone sensors, instead of the ones pre-processed into windows which were provided in version 1. This change was done in order to be able to make online tests with the raw data. Moreover, the activity labels were updated in order to include postural transitions that were not part of the previous version of the dataset. \n\n\n\n- Curated by: Reyes-Ortiz, Jorge, Anguita, Davide, Ghio, Alessandro, Oneto, Luca, and Parra, Xavier\n- License: This dataset is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) license.",
"### Dataset Sources\n\n- Repository: URL\n- Paper: URL\n- Experiment Demo: URL\n\n\nBibTeX:\n\n@misc{misc_smartphone-based_recognition_of_human_activities_and_postural_transitions_341,\n author = {Reyes-Ortiz,Jorge, Anguita,Davide, Oneto,Luca, and Parra,Xavier},\n title = {{Smartphone-Based Recognition of Human Activities and Postural Transitions}},\n year = {2015},\n howpublished = {UCI Machine Learning Repository},\n note = {{DOI}: URL\n}\n\n\nAPA:\n\nReyes-Ortiz, Jorge, Anguita, Davide, Oneto, Luca, and Parra, Xavier. (2015). Smartphone-Based Recognition of Human Activities and Postural Transitions. UCI Machine Learning Repository. URL"
] |
[
"TAGS\n#size_categories-n<1K #license-cc-by-4.0 #region-us \n",
"# Dataset Card for HAR\n\nA tabular dataset which poses the task of prediction human activity based on smartphone sensor signal (accelerometer and gyroscope).",
"## Dataset Details",
"### Dataset Description\n\n*Summary from URL\nThe experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. They performed a protocol of activities composed of six basic activities: three static postures (standing, sitting, lying) and three dynamic activities (walking, walking downstairs and walking upstairs). The experiment also included postural transitions that occurred between the static postures. These are: stand-to-sit, sit-to-stand, sit-to-lie, lie-to-sit, stand-to-lie, and lie-to-stand. All the participants were wearing a smartphone (Samsung Galaxy S II) on the waist during the experiment execution. We captured 3-axial linear acceleration and 3-axial angular velocity at a constant rate of 50Hz using the embedded accelerometer and gyroscope of the device. The experiments were video-recorded to label the data manually. The obtained dataset was randomly partitioned into two sets, where 70% of the volunteers was selected for generating the training data and 30% the test data. \n\nThe sensor signals (accelerometer and gyroscope) were pre-processed by applying noise filters and then sampled in fixed-width sliding windows of 2.56 sec and 50% overlap (128 readings/window). The sensor acceleration signal, which has gravitational and body motion components, was separated using a Butterworth low-pass filter into body acceleration and gravity. The gravitational force is assumed to have only low frequency components, therefore a filter with 0.3 Hz cutoff frequency was used. From each window, a vector of 561 features was obtained by calculating variables from the time and frequency domain. See 'features_info.txt' for more details. \n\nThis dataset is an updated version of the UCI Human Activity Recognition Using smartphones Dataset that can be found at: URL\n\nThis version provides the original raw inertial signals from the smartphone sensors, instead of the ones pre-processed into windows which were provided in version 1. This change was done in order to be able to make online tests with the raw data. Moreover, the activity labels were updated in order to include postural transitions that were not part of the previous version of the dataset. \n\n\n\n- Curated by: Reyes-Ortiz, Jorge, Anguita, Davide, Ghio, Alessandro, Oneto, Luca, and Parra, Xavier\n- License: This dataset is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) license.",
"### Dataset Sources\n\n- Repository: URL\n- Paper: URL\n- Experiment Demo: URL\n\n\nBibTeX:\n\n@misc{misc_smartphone-based_recognition_of_human_activities_and_postural_transitions_341,\n author = {Reyes-Ortiz,Jorge, Anguita,Davide, Oneto,Luca, and Parra,Xavier},\n title = {{Smartphone-Based Recognition of Human Activities and Postural Transitions}},\n year = {2015},\n howpublished = {UCI Machine Learning Repository},\n note = {{DOI}: URL\n}\n\n\nAPA:\n\nReyes-Ortiz, Jorge, Anguita, Davide, Oneto, Luca, and Parra, Xavier. (2015). Smartphone-Based Recognition of Human Activities and Postural Transitions. UCI Machine Learning Repository. URL"
] |
[
25,
34,
4,
582,
202
] |
[
"passage: TAGS\n#size_categories-n<1K #license-cc-by-4.0 #region-us \n# Dataset Card for HAR\n\nA tabular dataset which poses the task of prediction human activity based on smartphone sensor signal (accelerometer and gyroscope).## Dataset Details"
] |
cee22d4efa0b8ee68dbd72ec581459824541018e
|
# Dataset Card for "squad_baseline_train_10_eval_10"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
tyzhu/squad_baseline_train_10_eval_10
|
[
"region:us"
] |
2023-09-19T04:29:51+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "answers", "sequence": [{"name": "text", "dtype": "string"}, {"name": "answer_start", "dtype": "int32"}]}, {"name": "context_id", "dtype": "string"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 52389, "num_examples": 51}, {"name": "validation", "num_bytes": 58313, "num_examples": 48}], "download_size": 0, "dataset_size": 110702}}
|
2023-09-19T04:30:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "squad_baseline_train_10_eval_10"
More Information needed
|
[
"# Dataset Card for \"squad_baseline_train_10_eval_10\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_baseline_train_10_eval_10\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_baseline_train_10_eval_10\"\n\nMore Information needed"
] |
07c07206ad0ea27ea316d55975a9f9fc18f9bdb3
|
# Dataset Card for "data_aug_cua"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
linhqyy/data_aug_cua
|
[
"region:us"
] |
2023-09-19T04:34:45+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "sentence", "dtype": "string"}, {"name": "intent", "dtype": "string"}, {"name": "entities", "list": [{"name": "type", "dtype": "string"}, {"name": "filler", "dtype": "string"}]}, {"name": "labels", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2162586, "num_examples": 9941}, {"name": "test", "num_bytes": 241472, "num_examples": 1105}], "download_size": 596926, "dataset_size": 2404058}}
|
2023-09-19T04:34:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "data_aug_cua"
More Information needed
|
[
"# Dataset Card for \"data_aug_cua\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"data_aug_cua\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"data_aug_cua\"\n\nMore Information needed"
] |
a2e08cf71735212c2c6acc76af30f638f1b05041
|
# Dataset Card for "orca-evaluated-falcon-gpt4-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sachith-surge/orca-evaluated-falcon-gpt4-v2
|
[
"region:us"
] |
2023-09-19T04:43:39+00:00
|
{"dataset_info": {"features": [{"name": "original_index", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "template_type", "dtype": "string"}, {"name": "system_message", "dtype": "string"}, {"name": "explained_targets", "dtype": "string"}, {"name": "dataset_source", "dtype": "string"}, {"name": "falcon_status", "dtype": "string"}, {"name": "falcon_rating", "dtype": "string"}, {"name": "falcon_reason", "dtype": "string"}, {"name": "gpt4_status", "dtype": "string"}, {"name": "gpt4_rating", "dtype": "string"}, {"name": "gpt4_reason", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6521750, "num_examples": 3517}], "download_size": 3081179, "dataset_size": 6521750}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-19T04:43:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "orca-evaluated-falcon-gpt4-v2"
More Information needed
|
[
"# Dataset Card for \"orca-evaluated-falcon-gpt4-v2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"orca-evaluated-falcon-gpt4-v2\"\n\nMore Information needed"
] |
[
6,
24
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"orca-evaluated-falcon-gpt4-v2\"\n\nMore Information needed"
] |
4a22ee02bf8c16b3841e4e9d424517d8e60e208d
|
# Dataset Card for "orca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sachith-surge/orca
|
[
"region:us"
] |
2023-09-19T04:46:10+00:00
|
{"dataset_info": {"features": [{"name": "original_index", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "template_type", "dtype": "string"}, {"name": "system_message", "dtype": "string"}, {"name": "explained_targets", "dtype": "string"}, {"name": "dataset_source", "dtype": "string"}, {"name": "falcon_status", "dtype": "string"}, {"name": "falcon_rating", "dtype": "string"}, {"name": "falcon_reason", "dtype": "string"}, {"name": "gpt4_status", "dtype": "string"}, {"name": "gpt4_rating", "dtype": "string"}, {"name": "gpt4_reason", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10761181, "num_examples": 5517}], "download_size": 5035931, "dataset_size": 10761181}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-19T04:46:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "orca"
More Information needed
|
[
"# Dataset Card for \"orca\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"orca\"\n\nMore Information needed"
] |
[
6,
12
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"orca\"\n\nMore Information needed"
] |
2eed0f63422f8aaff0ad35bbab11d69662251b34
|
# Bangumi Image Base of Jashin-chan Dropkick
This is the image base of the bangumi Jashin-chan Dropkick. We detected 44 characters and 6043 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise.** If you intend to train models on this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (approximately 1% of images).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1710 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 41 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 44 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 29 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 41 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 32 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 20 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 14 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 210 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 292 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 76 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 30 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 398 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 14 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 14 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 62 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 84 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 20 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 21 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 27 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 11 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 17 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 143 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 10 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 8 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 42 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 217 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 11 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 555 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 534 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 11 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 32 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 24 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 211 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 160 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 22 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 23 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 5 | [Download](37/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 38 | 355 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 33 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 7 | [Download](40/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 41 | 9 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 5 | [Download](42/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| noise | 419 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
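The per-character archives listed in the table above can be fetched directly from the Hub; a minimal sketch using `huggingface_hub`, with character 0's path taken from the table:

```python
import zipfile
from huggingface_hub import hf_hub_download

path = hf_hub_download(repo_id="BangumiBase/jashinchandropkick",
                       filename="0/dataset.zip",
                       repo_type="dataset")
with zipfile.ZipFile(path) as zf:
    zf.extractall("jashinchan_character_0")
```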
|
BangumiBase/jashinchandropkick
|
[
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] |
2023-09-19T04:49:55+00:00
|
{"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]}
|
2023-09-29T08:32:28+00:00
|
[] |
[] |
TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
|
Bangumi Image Base of Jashin-chan Dropkick
==========================================
This is the image base of the bangumi Jashin-chan Dropkick. We detected 44 characters and 6043 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% clean; they may still contain noise. If you intend to train models on this dataset, we recommend preprocessing the downloaded data to eliminate potentially noisy samples (approximately 1% of images).
Here is the characters' preview:
|
[] |
[
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |
[
25
] |
[
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |