sha (string, 40) | text (string, 1–13.4M) | id (string, 2–117) | tags (list, 1–7.91k) | created_at (string, 25) | metadata (string, 2–875k) | last_modified (string, 25) | arxiv (list, 0–25) | languages (list, 0–7.91k) | tags_str (string, 17–159k) | text_str (string, 1–447k) | text_lists (list, 0–352) | processed_texts (list, 1–353) | tokens_length (list, 1–353) | input_texts (list, 1–40) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ee79b5f871125fe3f6974d821d48bc53bd63a2fc
|
# Dataset Card for "flickr30k_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Tristan/flickr30k_test
|
[
"region:us"
] |
2023-09-04T21:34:11+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "caption", "list": "string"}, {"name": "sentids", "list": "string"}, {"name": "split", "dtype": "string"}, {"name": "img_id", "dtype": "string"}, {"name": "filename", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 142117238.54065907, "num_examples": 1000}], "download_size": 141466584, "dataset_size": 142117238.54065907}}
|
2023-09-04T21:36:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "flickr30k_test"
More Information needed
|
[
"# Dataset Card for \"flickr30k_test\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"flickr30k_test\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"flickr30k_test\"\n\nMore Information needed"
] |
e37fb6b9ccefb95009103f50db0c9e0bba9d4308
|
# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M",
"harness_winogrande_5",
split="train")
```
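The per-task splits are named after run timestamps, and each configuration also exposes a `latest` split pointing at the newest run; the aggregated scores live in the `results` configuration. A minimal sketch in the same vein, assuming only the config and split names listed in this repo's metadata:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M"

# Per-example details for one task, pinned to the newest run via the "latest" split.
details = load_dataset(REPO, "harness_winogrande_5", split="latest")

# Aggregated metrics for the newest run (structure shown under "Latest results" below).
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```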
## Latest results
These are the [latest results from run 2023-09-23T08:39:00.771555](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M/blob/main/results_2023-09-23T08-39-00.771555.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.2627936241610738,
"em_stderr": 0.004507560917898865,
"f1": 0.30115981543624176,
"f1_stderr": 0.004494140287139199,
"acc": 0.3666975232366727,
"acc_stderr": 0.008004674480789642
},
"harness|drop|3": {
"em": 0.2627936241610738,
"em_stderr": 0.004507560917898865,
"f1": 0.30115981543624176,
"f1_stderr": 0.004494140287139199
},
"harness|gsm8k|5": {
"acc": 0.015163002274450341,
"acc_stderr": 0.003366022949726345
},
"harness|winogrande|5": {
"acc": 0.7182320441988951,
"acc_stderr": 0.01264332601185294
}
}
```
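Since the metrics arrive as nested per-task dictionaries, flattening them makes runs easier to compare at a glance; a short illustrative sketch over the values shown above (plain Python, values rounded from the JSON):
```python
# Flatten the nested results dict above into (task, metric, value) rows.
results = {
    "all": {"em": 0.2628, "f1": 0.3012, "acc": 0.3667},
    "harness|drop|3": {"em": 0.2628, "f1": 0.3012},
    "harness|gsm8k|5": {"acc": 0.0152},
    "harness|winogrande|5": {"acc": 0.7182},
}
for task, metrics in results.items():
    for metric, value in metrics.items():
        print(f"{task:25s} {metric:4s} {value:.4f}")
```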
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M
|
[
"region:us"
] |
2023-09-04T21:46:11+00:00
|
{"pretty_name": "Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M", "dataset_summary": "Dataset automatically created during the evaluation run of model [synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M](https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-23T08:39:00.771555](https://huggingface.co/datasets/open-llm-leaderboard/details_synapsoft__Llama-2-7b-chat-hf-flan2022-1.2M/blob/main/results_2023-09-23T08-39-00.771555.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.2627936241610738,\n \"em_stderr\": 0.004507560917898865,\n \"f1\": 0.30115981543624176,\n \"f1_stderr\": 0.004494140287139199,\n \"acc\": 0.3666975232366727,\n \"acc_stderr\": 0.008004674480789642\n },\n \"harness|drop|3\": {\n \"em\": 0.2627936241610738,\n \"em_stderr\": 0.004507560917898865,\n \"f1\": 0.30115981543624176,\n \"f1_stderr\": 0.004494140287139199\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.015163002274450341,\n \"acc_stderr\": 0.003366022949726345\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7182320441988951,\n \"acc_stderr\": 0.01264332601185294\n }\n}\n```", "repo_url": "https://huggingface.co/synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|arc:challenge|25_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_23T08_39_00.771555", "path": ["**/details_harness|drop|3_2023-09-23T08-39-00.771555.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-23T08-39-00.771555.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_23T08_39_00.771555", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-39-00.771555.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-23T08-39-00.771555.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hellaswag|10_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T22:45:47.858606.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-04T22:45:47.858606.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_23T08_39_00.771555", "path": ["**/details_harness|winogrande|5_2023-09-23T08-39-00.771555.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-23T08-39-00.771555.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_04T22_45_47.858606", "path": ["results_2023-09-04T22:45:47.858606.parquet"]}, {"split": "2023_09_23T08_39_00.771555", "path": ["results_2023-09-23T08-39-00.771555.parquet"]}, {"split": "latest", "path": ["results_2023-09-23T08-39-00.771555.parquet"]}]}]}
|
2023-09-23T07:39:13+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-09-23T08:39:00.771555 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" config and in the "latest" split of each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T08:39:00.771555(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-23T08:39:00.771555(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
29,
31,
177,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model synapsoft/Llama-2-7b-chat-hf-flan2022-1.2M on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-23T08:39:00.771555(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
cf4ab67176ccf19f7b22dbe01e9508208c572d64
|
# Dataset Card for "kub_tickets_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
aqubed/kub_tickets_small
|
[
"region:us"
] |
2023-09-04T21:58:08+00:00
|
{"dataset_info": {"features": [{"name": "number", "dtype": "int64"}, {"name": "title", "dtype": "string"}, {"name": "state", "dtype": "string"}, {"name": "created_at", "dtype": "string"}, {"name": "updated_at", "dtype": "string"}, {"name": "closed_at", "dtype": "string"}, {"name": "assignees", "sequence": "string"}, {"name": "labels", "sequence": "string"}, {"name": "reporter", "dtype": "string"}, {"name": "comments", "list": [{"name": "body", "dtype": "string"}, {"name": "created_at", "dtype": "string"}]}, {"name": "events", "list": [{"name": "author", "dtype": "string"}, {"name": "created_at", "dtype": "string"}, {"name": "type", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 5967498, "num_examples": 1099}], "download_size": 1380020, "dataset_size": 5967498}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-04T22:08:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "kub_tickets_small"
More Information needed
|
[
"# Dataset Card for \"kub_tickets_small\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"kub_tickets_small\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"kub_tickets_small\"\n\nMore Information needed"
] |
42a8ef1bcfa36c82ee73f9622d2d084912ff3c67
|
# Dataset Card for "keywords_daily_dialog"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
edmundtsou/keywords_daily_dialog
|
[
"region:us"
] |
2023-09-04T23:16:59+00:00
|
{"dataset_info": {"features": [{"name": "dialog", "sequence": "string"}, {"name": "ids", "dtype": "int64"}, {"name": "keywords", "sequence": {"sequence": "string"}}], "splits": [{"name": "train", "num_bytes": 10163143, "num_examples": 13118}], "download_size": 5240789, "dataset_size": 10163143}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-04T23:17:00+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "keywords_daily_dialog"
More Information needed
|
[
"# Dataset Card for \"keywords_daily_dialog\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"keywords_daily_dialog\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"keywords_daily_dialog\"\n\nMore Information needed"
] |
b14f6e68038f11a9c0765a8ab67fdf8003593095
|
Translate German to English
|
vhtran/en-de-2023
|
[
"license:cc-by-4.0",
"region:us"
] |
2023-09-04T23:54:47+00:00
|
{"license": "cc-by-4.0"}
|
2023-09-05T00:00:23+00:00
|
[] |
[] |
TAGS
#license-cc-by-4.0 #region-us
|
Translate German to English
|
[] |
[
"TAGS\n#license-cc-by-4.0 #region-us \n"
] |
[
15
] |
[
"passage: TAGS\n#license-cc-by-4.0 #region-us \n"
] |
559c5bdd34f5b3528d22f3bd5b3d9e744c6a7cfd
|
# Dataset Card for "mura_dataset_processed_224px_train_val_with_labels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
KhalfounMehdi/mura_dataset_processed_224px_train_val_with_labels
|
[
"region:us"
] |
2023-09-04T23:56:38+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}, {"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "abnormal", "1": "normal"}}}}, {"name": "labels", "dtype": "int64"}], "splits": [{"name": "test", "num_bytes": 99750354.875, "num_examples": 4001}, {"name": "train", "num_bytes": 897948950.5, "num_examples": 36004}, {"name": "validation", "num_bytes": 99750354.875, "num_examples": 4001}], "download_size": 1097501239, "dataset_size": 1097449660.25}}
|
2023-09-04T23:57:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "mura_dataset_processed_224px_train_val_with_labels"
More Information needed
|
[
"# Dataset Card for \"mura_dataset_processed_224px_train_val_with_labels\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"mura_dataset_processed_224px_train_val_with_labels\"\n\nMore Information needed"
] |
[
6,
31
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"mura_dataset_processed_224px_train_val_with_labels\"\n\nMore Information needed"
] |
5c4c742ef5858694fffd232337e98366c5a56873
|
# Dataset Card for "c8b07d9e"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/c8b07d9e
|
[
"region:us"
] |
2023-09-05T00:04:58+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 184, "num_examples": 10}], "download_size": 1337, "dataset_size": 184}}
|
2023-09-05T00:04:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "c8b07d9e"
More Information needed
|
[
"# Dataset Card for \"c8b07d9e\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"c8b07d9e\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"c8b07d9e\"\n\nMore Information needed"
] |
0429c85f0c2748007bc198aed705ddf5e407a38f
|
# Dataset Card for "products_desc_and_marktng_emails_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
honglinggoh/products_desc_and_marktng_emails_dataset
|
[
"region:us"
] |
2023-09-05T00:15:04+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 25130, "num_examples": 13}], "download_size": 27570, "dataset_size": 25130}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T00:15:09+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "products_desc_and_marktng_emails_dataset"
More Information needed
|
[
"# Dataset Card for \"products_desc_and_marktng_emails_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"products_desc_and_marktng_emails_dataset\"\n\nMore Information needed"
] |
[
6,
26
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"products_desc_and_marktng_emails_dataset\"\n\nMore Information needed"
] |
64aaa208a5f91a394a158f6c15b3261e49e311c4
|
# Dataset Card for "genai_marketmail_sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
jschew39/genai_marketmail_sample
|
[
"region:us"
] |
2023-09-05T00:20:17+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15396, "num_examples": 8}], "download_size": 23299, "dataset_size": 15396}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T00:20:19+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "genai_marketmail_sample"
More Information needed
|
[
"# Dataset Card for \"genai_marketmail_sample\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"genai_marketmail_sample\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"genai_marketmail_sample\"\n\nMore Information needed"
] |
672829fb18e6868ec9ceff5e928da9122e363e98
|
# Dataset Card for "dino_marketing_emails"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sparkyfina/dino_marketing_emails
|
[
"region:us"
] |
2023-09-05T00:24:46+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 37399, "num_examples": 20}], "download_size": 33872, "dataset_size": 37399}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T00:24:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dino_marketing_emails"
More Information needed
|
[
"# Dataset Card for \"dino_marketing_emails\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dino_marketing_emails\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dino_marketing_emails\"\n\nMore Information needed"
] |
277ca42fc51dd9ebb25bdaf18283a117223cc1d7
|
# Dataset Card for "fb-workshop-09042023-marketing-email-ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
unmeshk/fb-workshop-09042023-marketing-email-ds
|
[
"region:us"
] |
2023-09-05T00:25:17+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18423, "num_examples": 10}], "download_size": 24475, "dataset_size": 18423}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T00:25:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "fb-workshop-09042023-marketing-email-ds"
More Information needed
|
[
"# Dataset Card for \"fb-workshop-09042023-marketing-email-ds\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"fb-workshop-09042023-marketing-email-ds\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"fb-workshop-09042023-marketing-email-ds\"\n\nMore Information needed"
] |
df6256457504c3c43360496e33dd458cb08fb437
|
## Model description
I crawled this data from these sites: https://prep.vn/blog/idiom-theo-chu-de-trong-tieng-anh/ and https://www.enewsdispatch.com/

I carefully translated the idiom site; the enews site, however, I translated with Google Translate.
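A minimal loading sketch, under two assumptions that this card does not confirm: the repo hosts data files that `load_dataset` can auto-detect, and a `train` split exists. Printing a few rows first avoids guessing at undocumented column names:

```python
from datasets import load_dataset

# Assumption: the repo's data files are auto-detectable by `datasets`.
ds = load_dataset("duwuonline/en_vi_advanced_sentences", split="train")

# Inspect the actual schema before relying on any column names.
for row in ds.select(range(3)):
    print(row)
```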
|
duwuonline/en_vi_advanced_sentences
|
[
"task_categories:translation",
"language:vi",
"language:en",
"license:other",
"region:us"
] |
2023-09-05T00:29:34+00:00
|
{"language": ["vi", "en"], "license": "other", "task_categories": ["translation"]}
|
2023-09-05T00:34:12+00:00
|
[] |
[
"vi",
"en"
] |
TAGS
#task_categories-translation #language-Vietnamese #language-English #license-other #region-us
|
## Model description
I crawled this data from these sites: URL and URL

I carefully translated the idiom site; the enews site, however, I translated with Google Translate.
|
[
"## Model description\nThis data I crawled from these site: URL and URL\t\n\nIdiom site I carefully translation, however, the enews site I use google translate"
] |
[
"TAGS\n#task_categories-translation #language-Vietnamese #language-English #license-other #region-us \n",
"## Model description\nThis data I crawled from these site: URL and URL\t\n\nIdiom site I carefully translation, however, the enews site I use google translate"
] |
[
31,
34
] |
[
"passage: TAGS\n#task_categories-translation #language-Vietnamese #language-English #license-other #region-us \n## Model description\nThis data I crawled from these site: URL and URL\t\n\nIdiom site I carefully translation, however, the enews site I use google translate"
] |
39c0b485635990539aa7ace6e42e16f01237815d
|
# Dataset Card for "marketing-mail"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
amalina-k/marketing-mail
|
[
"region:us"
] |
2023-09-05T00:40:58+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7993, "num_examples": 5}], "download_size": 16846, "dataset_size": 7993}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T00:40:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "marketing-mail"
More Information needed
|
[
"# Dataset Card for \"marketing-mail\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"marketing-mail\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"marketing-mail\"\n\nMore Information needed"
] |
05b0562ad97c4972d209c1420d10a2bb67aaf889
|
# Dataset Card for Evaluation run of Kiddyz/testlm-3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Kiddyz/testlm-3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Kiddyz/testlm-3](https://huggingface.co/Kiddyz/testlm-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-3",
"harness_truthfulqa_mc_0",
split="train")
```
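The aggregated metrics can be pulled the same way through the "results" configuration mentioned above; this sketch uses the "latest" split alias declared in the repo's config list:

```python
from datasets import load_dataset

# "results" stores one row per evaluation run; the "latest" split
# always aliases the most recent timestamped run.
results = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-3",
                       "results",
                       split="latest")
print(results[0])
```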
## Latest results
These are the [latest results from run 2023-09-05T01:42:44.018659](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-3/blob/main/results_2023-09-05T01%3A42%3A44.018659.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5187353874495172,
"acc_stderr": 0.035029744575697866,
"acc_norm": 0.5224927626027567,
"acc_norm_stderr": 0.03501563654467944,
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916916,
"mc2": 0.46416337242431305,
"mc2_stderr": 0.015037341919156266
},
"harness|arc:challenge|25": {
"acc": 0.5025597269624573,
"acc_stderr": 0.014611199329843784,
"acc_norm": 0.5358361774744027,
"acc_norm_stderr": 0.014573813664735718
},
"harness|hellaswag|10": {
"acc": 0.5963951404102769,
"acc_stderr": 0.004896173035943312,
"acc_norm": 0.7848038239394542,
"acc_norm_stderr": 0.00410118487096419
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731837,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793254,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793254
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.04161808503501528,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.04161808503501528
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.02397386199899208,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.02397386199899208
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5967741935483871,
"acc_stderr": 0.027906150826041146,
"acc_norm": 0.5967741935483871,
"acc_norm_stderr": 0.027906150826041146
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6414141414141414,
"acc_stderr": 0.03416903640391521,
"acc_norm": 0.6414141414141414,
"acc_norm_stderr": 0.03416903640391521
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7150259067357513,
"acc_stderr": 0.03257714077709662,
"acc_norm": 0.7150259067357513,
"acc_norm_stderr": 0.03257714077709662
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.49743589743589745,
"acc_stderr": 0.025350672979412202,
"acc_norm": 0.49743589743589745,
"acc_norm_stderr": 0.025350672979412202
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.46638655462184875,
"acc_stderr": 0.03240501447690071,
"acc_norm": 0.46638655462184875,
"acc_norm_stderr": 0.03240501447690071
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7009174311926606,
"acc_stderr": 0.019630417285415182,
"acc_norm": 0.7009174311926606,
"acc_norm_stderr": 0.019630417285415182
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.03198001660115072,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.03198001660115072
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6106870229007634,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.6106870229007634,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624505,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624505
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04668408033024931,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04668408033024931
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.6699029126213593,
"acc_stderr": 0.0465614711001235,
"acc_norm": 0.6699029126213593,
"acc_norm_stderr": 0.0465614711001235
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7521367521367521,
"acc_stderr": 0.0282863240755644,
"acc_norm": 0.7521367521367521,
"acc_norm_stderr": 0.0282863240755644
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.698595146871009,
"acc_stderr": 0.016409091097268784,
"acc_norm": 0.698595146871009,
"acc_norm_stderr": 0.016409091097268784
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.026817718130348916,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.026817718130348916
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41899441340782123,
"acc_stderr": 0.016501579306861674,
"acc_norm": 0.41899441340782123,
"acc_norm_stderr": 0.016501579306861674
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.02791705074848463,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.02791705074848463
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887863,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887863
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38396349413298564,
"acc_stderr": 0.012421587833134231,
"acc_norm": 0.38396349413298564,
"acc_norm_stderr": 0.012421587833134231
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016633,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4493464052287582,
"acc_stderr": 0.020123766528027262,
"acc_norm": 0.4493464052287582,
"acc_norm_stderr": 0.020123766528027262
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355043,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355043
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4036144578313253,
"acc_stderr": 0.038194861407583984,
"acc_norm": 0.4036144578313253,
"acc_norm_stderr": 0.038194861407583984
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3157894736842105,
"mc1_stderr": 0.016272287957916916,
"mc2": 0.46416337242431305,
"mc2_stderr": 0.015037341919156266
}
}
```
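As a reading aid, a small sketch (assuming the JSON above has been parsed into a Python dict named `results`, e.g. via `json.load`) that macro-averages the per-subject MMLU accuracies:

```python
# `results` is assumed to be the dictionary printed above.
mmlu = {name: scores["acc"]
        for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
print(f"{len(mmlu)} MMLU subtasks, "
      f"macro-average acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```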
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Kiddyz__testlm-3
|
[
"region:us"
] |
2023-09-05T00:43:07+00:00
|
{"pretty_name": "Evaluation run of Kiddyz/testlm-3", "dataset_summary": "Dataset automatically created during the evaluation run of model [Kiddyz/testlm-3](https://huggingface.co/Kiddyz/testlm-3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kiddyz__testlm-3\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-05T01:42:44.018659](https://huggingface.co/datasets/open-llm-leaderboard/details_Kiddyz__testlm-3/blob/main/results_2023-09-05T01%3A42%3A44.018659.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5187353874495172,\n \"acc_stderr\": 0.035029744575697866,\n \"acc_norm\": 0.5224927626027567,\n \"acc_norm_stderr\": 0.03501563654467944,\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.46416337242431305,\n \"mc2_stderr\": 0.015037341919156266\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5025597269624573,\n \"acc_stderr\": 0.014611199329843784,\n \"acc_norm\": 0.5358361774744027,\n \"acc_norm_stderr\": 0.014573813664735718\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5963951404102769,\n \"acc_stderr\": 0.004896173035943312,\n \"acc_norm\": 0.7848038239394542,\n \"acc_norm_stderr\": 0.00410118487096419\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731837,\n \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731837\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 0.5208333333333334,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n 
},\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793254,\n \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793254\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.04161808503501528,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.04161808503501528\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.31746031746031744,\n \"acc_stderr\": 0.02397386199899208,\n \"acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.02397386199899208\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5967741935483871,\n \"acc_stderr\": 0.027906150826041146,\n \"acc_norm\": 0.5967741935483871,\n \"acc_norm_stderr\": 0.027906150826041146\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6414141414141414,\n \"acc_stderr\": 0.03416903640391521,\n \"acc_norm\": 0.6414141414141414,\n \"acc_norm_stderr\": 0.03416903640391521\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7150259067357513,\n \"acc_stderr\": 0.03257714077709662,\n \"acc_norm\": 0.7150259067357513,\n \"acc_norm_stderr\": 0.03257714077709662\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.49743589743589745,\n \"acc_stderr\": 0.025350672979412202,\n 
\"acc_norm\": 0.49743589743589745,\n \"acc_norm_stderr\": 0.025350672979412202\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.46638655462184875,\n \"acc_stderr\": 0.03240501447690071,\n \"acc_norm\": 0.46638655462184875,\n \"acc_norm_stderr\": 0.03240501447690071\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7009174311926606,\n \"acc_stderr\": 0.019630417285415182,\n \"acc_norm\": 0.7009174311926606,\n \"acc_norm_stderr\": 0.019630417285415182\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.03198001660115072,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.03198001660115072\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.6106870229007634,\n \"acc_stderr\": 0.04276486542814591,\n \"acc_norm\": 0.6106870229007634,\n \"acc_norm_stderr\": 0.04276486542814591\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.04412015806624505,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624505\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.04668408033024931,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.04668408033024931\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.038741028598180814,\n \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.038741028598180814\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6699029126213593,\n \"acc_stderr\": 0.0465614711001235,\n \"acc_norm\": 0.6699029126213593,\n \"acc_norm_stderr\": 0.0465614711001235\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7521367521367521,\n \"acc_stderr\": 0.0282863240755644,\n \"acc_norm\": 0.7521367521367521,\n \"acc_norm_stderr\": 0.0282863240755644\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.698595146871009,\n \"acc_stderr\": 0.016409091097268784,\n \"acc_norm\": 0.698595146871009,\n \"acc_norm_stderr\": 0.016409091097268784\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.026817718130348916,\n \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.026817718130348916\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41899441340782123,\n \"acc_stderr\": 0.016501579306861674,\n \"acc_norm\": 0.41899441340782123,\n \"acc_norm_stderr\": 0.016501579306861674\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n \"acc_stderr\": 0.02791705074848463,\n \"acc_norm\": 0.5916398713826366,\n \"acc_norm_stderr\": 0.02791705074848463\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887863,\n \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887863\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n \"acc_stderr\": 0.012421587833134231,\n \"acc_norm\": 0.38396349413298564,\n \"acc_norm_stderr\": 0.012421587833134231\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016633,\n \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016633\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4493464052287582,\n \"acc_stderr\": 0.020123766528027262,\n \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.020123766528027262\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355043,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355043\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4036144578313253,\n \"acc_stderr\": 0.038194861407583984,\n \"acc_norm\": 0.4036144578313253,\n \"acc_norm_stderr\": 0.038194861407583984\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3157894736842105,\n \"mc1_stderr\": 0.016272287957916916,\n \"mc2\": 0.46416337242431305,\n \"mc2_stderr\": 0.015037341919156266\n }\n}\n```", "repo_url": "https://huggingface.co/Kiddyz/testlm-3", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": 
["**/details_harness|arc:challenge|25_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hellaswag|10_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet", 
"**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet", 
"**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T01:42:44.018659.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T01_42_44.018659", "path": ["results_2023-09-05T01:42:44.018659.parquet"]}, {"split": "latest", "path": ["results_2023-09-05T01:42:44.018659.parquet"]}]}]}
|
2023-09-05T00:44:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Kiddyz/testlm-3
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Kiddyz/testlm-3 on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
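A minimal sketch (the repository id `open-llm-leaderboard/details_Kiddyz__testlm-3` follows the leaderboard's usual naming scheme and is an assumption here; `harness_truthfulqa_mc_0` is one of the configurations listed in this card's metadata):

```python
from datasets import load_dataset

# One configuration per evaluated task; the "latest" split always points
# to the most recent run.
data = load_dataset("open-llm-leaderboard/details_Kiddyz__testlm-3",
                    "harness_truthfulqa_mc_0",
                    split="latest")
```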
## Latest results
These are the latest results from run 2023-09-05T01:42:44.018659 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Kiddyz/testlm-3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Kiddyz/testlm-3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-05T01:42:44.018659(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Kiddyz/testlm-3",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Kiddyz/testlm-3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-05T01:42:44.018659(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
16,
31,
164,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Kiddyz/testlm-3## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Kiddyz/testlm-3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-05T01:42:44.018659(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f22daba412624d6457dab7b41c7301a3376ccac6
|
# Dataset Card for "Krishanu_Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
krishanusinha20/Krishanu_Dataset
|
[
"region:us"
] |
2023-09-05T00:47:22+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20991, "num_examples": 10}], "download_size": 30027, "dataset_size": 20991}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T00:47:27+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "Krishanu_Dataset"
More Information needed
|
[
"# Dataset Card for \"Krishanu_Dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"Krishanu_Dataset\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"Krishanu_Dataset\"\n\nMore Information needed"
] |
e5673a90b98d02c4832f9e836d72762f0e8933a0
|
# FinTabNet.c
The FinTabNet.c dataset was released in 2023.
You can think of FinTabNet.c as a fork (a modified version, in this case by different authors) of the original FinTabNet dataset.
FinTabNet.c contains:
- automated corrections of FinTabNet (such as canonicalization) to correct oversegmentation and to make the dataset more consistent with other TSR datasets, like PubTables-1M
- fewer samples than FinTabNet, where samples were removed whose annotations could not be either automatically processed, corrected, or verified
For more details about this version (2023) of the dataset and the adjustments made to the original dataset, please see ["Aligning benchmark datasets for table structure recognition"](https://arxiv.org/abs/2303.00716).
For the code used to create this dataset, see [https://github.com/microsoft/table-transformer](https://github.com/microsoft/table-transformer).
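If you only need the raw files, here is a minimal sketch using `huggingface_hub` (the repository id `bsmock/FinTabNet.c` is this dataset's id; the target directory is a hypothetical choice, and this is an illustrative download, not an official script):

```python
from huggingface_hub import snapshot_download

# Download every file in the dataset repository to a local directory.
local_path = snapshot_download(
    repo_id="bsmock/FinTabNet.c",
    repo_type="dataset",
    local_dir="FinTabNet.c",  # hypothetical local target
)
print(local_path)
```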
## Citing
If you use this dataset in your published work, please cite:
```
@inproceedings{smock2023aligning,
title={Aligning benchmark datasets for table structure recognition},
author={Smock, Brandon and Pesala, Rohith and Abraham, Robin},
booktitle={International Conference on Document Analysis and Recognition},
pages={371--386},
year={2023},
organization={Springer}
}
```
## About the original FinTabNet dataset
Please see: [https://developer.ibm.com/data/fintabnet/](https://developer.ibm.com/data/fintabnet/) (link last checked September 2023).
### Original license
According to the dataset website, the license of the original FinTabNet dataset is [CDLA-Permissive](https://cdla.dev/permissive-1-0/).
|
bsmock/FinTabNet.c
|
[
"license:cdla-permissive-2.0",
"table structure recognition",
"table extraction",
"arxiv:2303.00716",
"region:us"
] |
2023-09-05T00:50:47+00:00
|
{"license": "cdla-permissive-2.0", "tags": ["table structure recognition", "table extraction"]}
|
2023-09-07T03:50:07+00:00
|
[
"2303.00716"
] |
[] |
TAGS
#license-cdla-permissive-2.0 #table structure recognition #table extraction #arxiv-2303.00716 #region-us
|
# FinTabNet.c
The FinTabNet.c dataset was released in 2023.
You can think of FinTabNet.c as a fork (a modified version, in this case by different authors) of the original FinTabNet dataset.
FinTabNet.c contains:
- automated corrections of FinTabNet (such as canonicalization) to correct oversegmentation and to make the dataset more consistent with other TSR datasets, like PubTables-1M
- fewer samples than FinTabNet, where samples were removed whose annotations could not be either automatically processed, corrected, or verified
For more details about this version (2023) of the dataset and the adjustments made to the original dataset, please see "Aligning benchmark datasets for table structure recognition".
For the code used to create this dataset, see URL
## Citing
If you use this dataset in your published work, please cite:
## About the original FinTabNet dataset
Please see: URL (link last checked September 2023).
### Original license
According to the dataset website, the license of the original FinTabNet dataset is CDLA-Permissive.
|
[
"# FinTabNet.c\n\nThe FinTabNet.c dataset was released in 2023.\n\nYou can think of FinTabNet.c as a fork (a modified version, in this case by different authors) of the original FinTabNet dataset.\n\nFinTabNet.c contains:\n- automated corrections of FinTabNet (such as canonicalization) to correct oversegmentation and to make the dataset more consistent with other TSR datasets, like PubTables-1M\n- fewer samples than FinTabNet, where samples were removed whose annotations could not be either automatically processed, corrected, or verified\n\nFor more details about this version (2023) of the dataset and the adjustments made to the original dataset, please see \"Aligning benchmark datasets for table structure recognition\".\n\nFor the code used to create this dataset, see URL",
"## Citing\n\nIf you use this dataset in your published work, please cite:",
"## About the original FinTabNet dataset\n\nPlease see: URL (link last checked September 2023).",
"### Original license\n\nAccording to the dataset website, the license of the original FinTabNet dataset is CDLA-Permissive."
] |
[
"TAGS\n#license-cdla-permissive-2.0 #table structure recognition #table extraction #arxiv-2303.00716 #region-us \n",
"# FinTabNet.c\n\nThe FinTabNet.c dataset was released in 2023.\n\nYou can think of FinTabNet.c as a fork (a modified version, in this case by different authors) of the original FinTabNet dataset.\n\nFinTabNet.c contains:\n- automated corrections of FinTabNet (such as canonicalization) to correct oversegmentation and to make the dataset more consistent with other TSR datasets, like PubTables-1M\n- fewer samples than FinTabNet, where samples were removed whose annotations could not be either automatically processed, corrected, or verified\n\nFor more details about this version (2023) of the dataset and the adjustments made to the original dataset, please see \"Aligning benchmark datasets for table structure recognition\".\n\nFor the code used to create this dataset, see URL",
"## Citing\n\nIf you use this dataset in your published work, please cite:",
"## About the original FinTabNet dataset\n\nPlease see: URL (link last checked September 2023).",
"### Original license\n\nAccording to the dataset website, the license of the original FinTabNet dataset is CDLA-Permissive."
] |
[
34,
191,
17,
21,
29
] |
[
"passage: TAGS\n#license-cdla-permissive-2.0 #table structure recognition #table extraction #arxiv-2303.00716 #region-us \n# FinTabNet.c\n\nThe FinTabNet.c dataset was released in 2023.\n\nYou can think of FinTabNet.c as a fork (a modified version, in this case by different authors) of the original FinTabNet dataset.\n\nFinTabNet.c contains:\n- automated corrections of FinTabNet (such as canonicalization) to correct oversegmentation and to make the dataset more consistent with other TSR datasets, like PubTables-1M\n- fewer samples than FinTabNet, where samples were removed whose annotations could not be either automatically processed, corrected, or verified\n\nFor more details about this version (2023) of the dataset and the adjustments made to the original dataset, please see \"Aligning benchmark datasets for table structure recognition\".\n\nFor the code used to create this dataset, see URL## Citing\n\nIf you use this dataset in your published work, please cite:## About the original FinTabNet dataset\n\nPlease see: URL (link last checked September 2023).### Original license\n\nAccording to the dataset website, the license of the original FinTabNet dataset is CDLA-Permissive."
] |
284c982cacb30cde3a6eb5913ffe97fd0098a813
|
# Dataset Card for "marketing_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
kuokxuen/marketing_dataset
|
[
"region:us"
] |
2023-09-05T00:52:20+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 197836, "num_examples": 100}], "download_size": 120464, "dataset_size": 197836}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T00:52:22+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "marketing_dataset"
More Information needed
|
[
"# Dataset Card for \"marketing_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"marketing_dataset\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"marketing_dataset\"\n\nMore Information needed"
] |
d8b1bb3b6b074eb3488741d604ff15a5e400c571
|
# mpi3d_real
This repository is an _unofficial_ backup service for the `MPI3D-Real` dataset, which was introduced in the paper: [**On the Transfer of Inductive Bias from Simulation to the Real World: a New Disentanglement Dataset**](https://proceedings.neurips.cc/paper/2019/hash/d97d404b6119214e4a7018391195240a-Abstract.html). The dataset contains images of real objects with varying factors of variation, such as shape, color, texture and pose. The dataset is useful for studying the generalization and disentanglement abilities of representation learning models. The backup service allows users to download the dataset from a mirror site in case the original source is unavailable.
For more detailed information on the dataset, please check the [original repository](https://github.com/rr-learning/disentanglement_dataset)
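One possible way to pull a local copy of this mirror, sketched with `huggingface_hub` (the directory name is arbitrary, and this is an assumption about how you want to consume the files, not an official loader):

```python
from huggingface_hub import snapshot_download

# Mirror the backup repository locally.
snapshot_download(repo_id="cun-bjy/mpi3d_real",
                  repo_type="dataset",
                  local_dir="mpi3d_real")
```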
## Reference
[1] Gondal, Muhammad Waleed, et al. "On the transfer of inductive bias from simulation to the real world: a new disentanglement dataset." Advances in Neural Information Processing Systems 32 (2019).
|
cun-bjy/mpi3d_real
|
[
"task_categories:feature-extraction",
"size_categories:100M<n<1B",
"language:ar",
"license:bsd",
"code",
"region:us"
] |
2023-09-05T01:36:10+00:00
|
{"language": ["ar"], "license": "bsd", "size_categories": ["100M<n<1B"], "task_categories": ["feature-extraction"], "pretty_name": "mpi3d_real", "tags": ["code"]}
|
2023-09-05T02:49:03+00:00
|
[] |
[
"ar"
] |
TAGS
#task_categories-feature-extraction #size_categories-100M<n<1B #language-Arabic #license-bsd #code #region-us
|
# mpi3d_real
This repository is an _unofficial_ backup service for the 'MPI3D-Real' dataset, which was introduced in the paper: On the Transfer of Inductive Bias from Simulation to the Real World: a New Disentanglement Dataset. The dataset contains images of real objects with varying factors of variation, such as shape, color, texture and pose. The dataset is useful for studying the generalization and disentanglement abilities of representation learning models. The backup service allows users to download the dataset from a mirror site in case the original source is unavailable.
For more detailed information on the dataset, please check the original repository
## Reference
[1] Gondal, Muhammad Waleed, et al. "On the transfer of inductive bias from simulation to the real world: a new disentanglement dataset." Advances in Neural Information Processing Systems 32 (2019).
|
[
"# mpi3d_real\nThis repository is a _unofficial_ backup service for 'MPI3D-Real' dataset, which is provided in this paper: On the Transfer of Inductive Bias from Simulation to the Real World: a New Disentanglement Dataset. The dataset contains images of real objects with varying factors of variation, such as shape, color, texture and pose. The dataset is useful for studying the generalization and disentanglement abilities of representation learning models. The backup service allows users to download the dataset from a mirror site in case the original source is unavailable.\n\nFor more detailed information on the dataset, please check the original repository",
"## Reference\n[1] Gondal, Muhammad Waleed, et al. \"On the transfer of inductive bias from simulation to the real world: a new disentanglement dataset.\" Advances in Neural Information Processing Systems 32 (2019)."
] |
[
"TAGS\n#task_categories-feature-extraction #size_categories-100M<n<1B #language-Arabic #license-bsd #code #region-us \n",
"# mpi3d_real\nThis repository is a _unofficial_ backup service for 'MPI3D-Real' dataset, which is provided in this paper: On the Transfer of Inductive Bias from Simulation to the Real World: a New Disentanglement Dataset. The dataset contains images of real objects with varying factors of variation, such as shape, color, texture and pose. The dataset is useful for studying the generalization and disentanglement abilities of representation learning models. The backup service allows users to download the dataset from a mirror site in case the original source is unavailable.\n\nFor more detailed information on the dataset, please check the original repository",
"## Reference\n[1] Gondal, Muhammad Waleed, et al. \"On the transfer of inductive bias from simulation to the real world: a new disentanglement dataset.\" Advances in Neural Information Processing Systems 32 (2019)."
] |
[
43,
153,
51
] |
[
"passage: TAGS\n#task_categories-feature-extraction #size_categories-100M<n<1B #language-Arabic #license-bsd #code #region-us \n# mpi3d_real\nThis repository is a _unofficial_ backup service for 'MPI3D-Real' dataset, which is provided in this paper: On the Transfer of Inductive Bias from Simulation to the Real World: a New Disentanglement Dataset. The dataset contains images of real objects with varying factors of variation, such as shape, color, texture and pose. The dataset is useful for studying the generalization and disentanglement abilities of representation learning models. The backup service allows users to download the dataset from a mirror site in case the original source is unavailable.\n\nFor more detailed information on the dataset, please check the original repository## Reference\n[1] Gondal, Muhammad Waleed, et al. \"On the transfer of inductive bias from simulation to the real world: a new disentanglement dataset.\" Advances in Neural Information Processing Systems 32 (2019)."
] |
b070f109e4ac89243fbc389f4952bef5a92f05ac
|
# Dataset Card for Evaluation run of Undi95/Nous-Hermes-13B-Code
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/Nous-Hermes-13B-Code
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/Nous-Hermes-13B-Code](https://huggingface.co/Undi95/Nous-Hermes-13B-Code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
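
# One configuration per evaluated task; each run is a timestamped split,
# and the "train" split always points to the latest results.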
data = load_dataset("open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-17T01:46:49.269980](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code/blob/main/results_2023-10-17T01-46-49.269980.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.19043624161073824,
"em_stderr": 0.004021054701391535,
"f1": 0.28277894295302086,
"f1_stderr": 0.004086388636430754,
"acc": 0.42762389052479904,
"acc_stderr": 0.010275468471163573
},
"harness|drop|3": {
"em": 0.19043624161073824,
"em_stderr": 0.004021054701391535,
"f1": 0.28277894295302086,
"f1_stderr": 0.004086388636430754
},
"harness|gsm8k|5": {
"acc": 0.10386656557998483,
"acc_stderr": 0.008403622228924035
},
"harness|winogrande|5": {
"acc": 0.7513812154696132,
"acc_stderr": 0.012147314713403108
}
}
```
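To read these aggregated numbers programmatically, here is a minimal sketch (the "results" configuration and its "latest" split appear in this card's configuration list; the exact row layout is an assumption, so the example just prints the first row):

```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; "latest" is the newest run.
results = load_dataset("open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated metrics row
```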
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code
|
[
"region:us"
] |
2023-09-05T01:42:25+00:00
|
{"pretty_name": "Evaluation run of Undi95/Nous-Hermes-13B-Code", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/Nous-Hermes-13B-Code](https://huggingface.co/Undi95/Nous-Hermes-13B-Code) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-17T01:46:49.269980](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code/blob/main/results_2023-10-17T01-46-49.269980.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.19043624161073824,\n \"em_stderr\": 0.004021054701391535,\n \"f1\": 0.28277894295302086,\n \"f1_stderr\": 0.004086388636430754,\n \"acc\": 0.42762389052479904,\n \"acc_stderr\": 0.010275468471163573\n },\n \"harness|drop|3\": {\n \"em\": 0.19043624161073824,\n \"em_stderr\": 0.004021054701391535,\n \"f1\": 0.28277894295302086,\n \"f1_stderr\": 0.004086388636430754\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10386656557998483,\n \"acc_stderr\": 0.008403622228924035\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7513812154696132,\n \"acc_stderr\": 0.012147314713403108\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/Nous-Hermes-13B-Code", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|arc:challenge|25_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_17T01_46_49.269980", "path": ["**/details_harness|drop|3_2023-10-17T01-46-49.269980.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-17T01-46-49.269980.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_17T01_46_49.269980", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-46-49.269980.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-17T01-46-49.269980.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hellaswag|10_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T02:42:01.860222.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T02:42:01.860222.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_17T01_46_49.269980", "path": ["**/details_harness|winogrande|5_2023-10-17T01-46-49.269980.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-17T01-46-49.269980.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T02_42_01.860222", "path": ["results_2023-09-05T02:42:01.860222.parquet"]}, {"split": "2023_10_17T01_46_49.269980", "path": ["results_2023-10-17T01-46-49.269980.parquet"]}, {"split": "latest", "path": ["results_2023-10-17T01-46-49.269980.parquet"]}]}]}
|
2023-10-17T00:47:02+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/Nous-Hermes-13B-Code
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Undi95/Nous-Hermes-13B-Code on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
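A minimal sketch of such a load call (the repo id below is an assumption based on the leaderboard's standard `details_<org>__<model>` naming, since this processed card elides URLs; the config name comes from this card's metadata):

```python
from datasets import load_dataset

# Repo id assumed from the standard open-llm-leaderboard naming scheme;
# "harness_winogrande_5" is one of the 64 task configurations listed in
# this card's metadata, and "train" always points to the latest run.
data = load_dataset(
    "open-llm-leaderboard/details_Undi95__Nous-Hermes-13B-Code",
    "harness_winogrande_5",
    split="train",
)
```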
## Latest results
These are the latest results from run 2023-10-17T01:46:49.269980 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Undi95/Nous-Hermes-13B-Code",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/Nous-Hermes-13B-Code on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T01:46:49.269980(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/Nous-Hermes-13B-Code",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/Nous-Hermes-13B-Code on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-17T01:46:49.269980(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/Nous-Hermes-13B-Code## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/Nous-Hermes-13B-Code on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-17T01:46:49.269980(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
9d4ca8f75eca6234fb0718c21754d48f603f1ec2
|
# Dataset Card for Evaluation run of Undi95/LewdEngine
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/LewdEngine
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/LewdEngine](https://huggingface.co/Undi95/LewdEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__LewdEngine",
"harness_winogrande_5",
split="train")
```
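The aggregated metrics shown below can be loaded the same way through the "results" configuration; a short sketch (the config and split names are taken from this card's metadata):

```python
from datasets import load_dataset

# "results" aggregates the scores of every run; the "latest" split
# points at the most recent one (here 2023-10-18T07:14:30.015522).
results = load_dataset(
    "open-llm-leaderboard/details_Undi95__LewdEngine",
    "results",
    split="latest",
)
print(results[0])  # aggregated scores, e.g. winogrande accuracy
```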
## Latest results
These are the [latest results from run 2023-10-18T07:14:30.015522](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__LewdEngine/blob/main/results_2023-10-18T07-14-30.015522.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666989,
"f1": 0.06167575503355703,
"f1_stderr": 0.0013753579135200263,
"acc": 0.4362959430292375,
"acc_stderr": 0.010625413263646535
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.0004566676462666989,
"f1": 0.06167575503355703,
"f1_stderr": 0.0013753579135200263
},
"harness|gsm8k|5": {
"acc": 0.12357846853677028,
"acc_stderr": 0.00906505030677692
},
"harness|winogrande|5": {
"acc": 0.7490134175217048,
"acc_stderr": 0.012185776220516151
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Undi95__LewdEngine
|
[
"region:us"
] |
2023-09-05T01:56:47+00:00
|
{"pretty_name": "Evaluation run of Undi95/LewdEngine", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/LewdEngine](https://huggingface.co/Undi95/LewdEngine) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__LewdEngine\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T07:14:30.015522](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__LewdEngine/blob/main/results_2023-10-18T07-14-30.015522.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666989,\n \"f1\": 0.06167575503355703,\n \"f1_stderr\": 0.0013753579135200263,\n \"acc\": 0.4362959430292375,\n \"acc_stderr\": 0.010625413263646535\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.0004566676462666989,\n \"f1\": 0.06167575503355703,\n \"f1_stderr\": 0.0013753579135200263\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12357846853677028,\n \"acc_stderr\": 0.00906505030677692\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7490134175217048,\n \"acc_stderr\": 0.012185776220516151\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/LewdEngine", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|arc:challenge|25_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T07_14_30.015522", "path": ["**/details_harness|drop|3_2023-10-18T07-14-30.015522.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T07-14-30.015522.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T07_14_30.015522", "path": ["**/details_harness|gsm8k|5_2023-10-18T07-14-30.015522.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T07-14-30.015522.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hellaswag|10_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T02:56:23.442470.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T02:56:23.442470.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T02:56:23.442470.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T07_14_30.015522", "path": ["**/details_harness|winogrande|5_2023-10-18T07-14-30.015522.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T07-14-30.015522.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T02_56_23.442470", "path": ["results_2023-09-05T02:56:23.442470.parquet"]}, {"split": "2023_10_18T07_14_30.015522", "path": ["results_2023-10-18T07-14-30.015522.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T07-14-30.015522.parquet"]}]}]}
|
2023-10-18T06:14:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/LewdEngine
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Undi95/LewdEngine on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-18T07:14:30.015522 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Undi95/LewdEngine",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/LewdEngine on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T07:14:30.015522(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/LewdEngine",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/LewdEngine on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T07:14:30.015522(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/LewdEngine## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/LewdEngine on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T07:14:30.015522(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
d1eb65743b37ea4100d0749fbc47de01e6fba30f
|
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikivis/gpt2-large-lora-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-sft](https://huggingface.co/Mikivis/gpt2-large-lora-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft",
"harness_winogrande_5",
split="train")
```
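Each evaluation run also remains available as a timestamped split; a sketch of loading the aggregated "results" configuration for both the latest and an earlier run (split names are taken from this card's metadata, assuming the "results" configuration carries the same timestamped splits as the task configurations):

```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft"

# Aggregated metrics for the most recent run:
latest = load_dataset(repo, "results", split="latest")

# The earlier 2023-09-05 run, addressed by its timestamped split name:
first_run = load_dataset(repo, "results", split="2023_09_05T03_15_39.228135")
```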
## Latest results
These are the [latest results from run 2023-10-27T18:03:42.739284](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft/blob/main/results_2023-10-27T18-03-42.739284.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965714,
"f1": 0.05463401845637592,
"f1_stderr": 0.001420933825490078,
"acc": 0.27545382794001577,
"acc_stderr": 0.006989729694570417
},
"harness|drop|3": {
"em": 0.0026216442953020135,
"em_stderr": 0.0005236685642965714,
"f1": 0.05463401845637592,
"f1_stderr": 0.001420933825490078
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5509076558800315,
"acc_stderr": 0.013979459389140834
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft
|
[
"region:us"
] |
2023-09-05T02:15:54+00:00
|
{"pretty_name": "Evaluation run of Mikivis/gpt2-large-lora-sft", "dataset_summary": "Dataset automatically created during the evaluation run of model [Mikivis/gpt2-large-lora-sft](https://huggingface.co/Mikivis/gpt2-large-lora-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-27T18:03:42.739284](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft/blob/main/results_2023-10-27T18-03-42.739284.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965714,\n \"f1\": 0.05463401845637592,\n \"f1_stderr\": 0.001420933825490078,\n \"acc\": 0.27545382794001577,\n \"acc_stderr\": 0.006989729694570417\n },\n \"harness|drop|3\": {\n \"em\": 0.0026216442953020135,\n \"em_stderr\": 0.0005236685642965714,\n \"f1\": 0.05463401845637592,\n \"f1_stderr\": 0.001420933825490078\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5509076558800315,\n \"acc_stderr\": 0.013979459389140834\n }\n}\n```", "repo_url": "https://huggingface.co/Mikivis/gpt2-large-lora-sft", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|arc:challenge|25_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_27T18_03_42.739284", "path": ["**/details_harness|drop|3_2023-10-27T18-03-42.739284.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-27T18-03-42.739284.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_27T18_03_42.739284", "path": ["**/details_harness|gsm8k|5_2023-10-27T18-03-42.739284.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-27T18-03-42.739284.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hellaswag|10_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T03:15:39.228135.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T03:15:39.228135.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_27T18_03_42.739284", "path": ["**/details_harness|winogrande|5_2023-10-27T18-03-42.739284.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-27T18-03-42.739284.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T03_15_39.228135", "path": ["results_2023-09-05T03:15:39.228135.parquet"]}, {"split": "2023_10_27T18_03_42.739284", "path": ["results_2023-10-27T18-03-42.739284.parquet"]}, {"split": "latest", "path": ["results_2023-10-27T18-03-42.739284.parquet"]}]}]}
|
2023-10-27T17:03:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-sft on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
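```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__gpt2-large-lora-sft",
	"harness_winogrande_5",
	split="train")
```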
## Latest results
These are the latest results from run 2023-10-27T18:03:42.739284 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
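```python
{
    "all": {
        "em": 0.0026216442953020135,
        "em_stderr": 0.0005236685642965714,
        "f1": 0.05463401845637592,
        "f1_stderr": 0.001420933825490078,
        "acc": 0.27545382794001577,
        "acc_stderr": 0.006989729694570417
    },
    "harness|drop|3": {
        "em": 0.0026216442953020135,
        "em_stderr": 0.0005236685642965714,
        "f1": 0.05463401845637592,
        "f1_stderr": 0.001420933825490078
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|winogrande|5": {
        "acc": 0.5509076558800315,
        "acc_stderr": 0.013979459389140834
    }
}
```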
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-27T18:03:42.739284(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-27T18:03:42.739284(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
24,
31,
172,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Mikivis/gpt2-large-lora-sft## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Mikivis/gpt2-large-lora-sft on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-27T18:03:42.739284(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
5cb65624776a9748bf55372cc74efa4276458e7a
|
# BLOSSOM MATH V2
### Introduction
[Blossom Math V3](https://huggingface.co/datasets/Azure99/blossom-math-v3) has been released! 🤗
Blossom Math V2 is a bilingual Chinese-English math conversation dataset derived from Math23K and GSM8K, suitable for fine-tuning models on math problems.
Compared with blossom-math-v1, it adds 2,500 GSM8K records and 2,500 GSM8K-CN records translated into Chinese. In addition, the answer-checking logic has been improved, and calculation annotations such as <<1+1=2>> have been removed to unify the style of the reasoning steps.
This dataset takes the full sets of questions from Math23K, GSM8K, and the translated GSM8K, then calls gpt-3.5-turbo-0613 to generate answers and validates them against the reference answers in the original datasets, filtering out incorrect ones; this largely guarantees the accuracy of both the questions and the answers.
This release contains 25% of the full data, i.e., 10K records.
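For illustration, a minimal sketch of the answer-verification step described above; the actual filtering code is not published, so the helper names and the final-number-matching heuristic are assumptions:
```python
import re

def extract_final_number(text: str):
    # Assumed heuristic: take the last number that appears in the text.
    matches = re.findall(r"-?\d+(?:\.\d+)?", text.replace(",", ""))
    return float(matches[-1]) if matches else None

def keep_generated_answer(generated: str, reference: str) -> bool:
    # Keep a gpt-3.5-turbo-0613 answer only if its final number matches the reference answer.
    pred = extract_final_number(generated)
    gold = extract_final_number(reference)
    return pred is not None and gold is not None and abs(pred - gold) < 1e-6
```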
### Languages
Chinese and English
### Dataset Structure
Each record represents one complete question and its answer, with five fields: id, input, output, answer, and dataset.
- id: string; the question id in the original source dataset, which together with the dataset field uniquely identifies a question.
- input: string; the question.
- output: string; the answer generated by gpt-3.5-turbo-0613.
- answer: string; the correct answer.
- dataset: string; the original source dataset.
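A minimal loading example; the standard 🤗 Datasets API and a single train split are assumed (the split name is not stated in this card):
```python
from datasets import load_dataset

# Assumption: the dataset exposes a single "train" split.
data = load_dataset("Azure99/blossom-math-v2", split="train")

sample = data[0]
print(sample["id"], sample["dataset"])  # together these uniquely identify the question
print(sample["input"])   # the question
print(sample["output"])  # the answer generated by gpt-3.5-turbo-0613
print(sample["answer"])  # the reference answer
```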
### Dataset Limitations
All responses in this dataset were generated by gpt-3.5-turbo-0613 and have undergone preliminary validation, but they may still contain inaccurate answers.
|
Azure99/blossom-math-v2
|
[
"task_categories:text-generation",
"task_categories:text2text-generation",
"size_categories:10K<n<100K",
"language:zh",
"license:apache-2.0",
"region:us"
] |
2023-09-05T02:19:29+00:00
|
{"language": ["zh"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation", "text2text-generation"]}
|
2023-12-21T13:12:54+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-Chinese #license-apache-2.0 #region-us
|
# BLOSSOM MATH V2
### Introduction
Blossom Math V3 has been released!
Blossom Math V2 is a bilingual Chinese-English math conversation dataset derived from Math23K and GSM8K, suitable for fine-tuning models on math problems.
Compared with blossom-math-v1, it adds 2,500 GSM8K records and 2,500 GSM8K-CN records translated into Chinese. In addition, the answer-checking logic has been improved, and calculation annotations such as <<1+1=2>> have been removed to unify the style of the reasoning steps.
This dataset takes the full sets of questions from Math23K, GSM8K, and the translated GSM8K, then calls gpt-3.5-turbo-0613 to generate answers and validates them against the reference answers in the original datasets, filtering out incorrect ones; this largely guarantees the accuracy of both the questions and the answers.
This release contains 25% of the full data, i.e., 10K records.
### Languages
Chinese and English
### Dataset Structure
Each record represents one complete question and its answer, with five fields: id, input, output, answer, and dataset.
- id: string; the question id in the original source dataset, which together with the dataset field uniquely identifies a question.
- input: string; the question.
- output: string; the answer generated by gpt-3.5-turbo-0613.
- answer: string; the correct answer.
- dataset: string; the original source dataset.
### Dataset Limitations
All responses in this dataset were generated by gpt-3.5-turbo-0613 and have undergone preliminary validation, but they may still contain inaccurate answers.
|
[
"# BLOSSOM MATH V2",
"### 介绍\n\nBlossom Math V3版本已发布!\n\nBlossom Math V2是基于Math23K和GSM8K衍生而来的中英双语数学对话数据集,适用于数学问题微调。\n\n相比于blossom-math-v1,新增了2500条GSM8K数据和翻译为中文的2500条GSM8K-CN数据。此外,优化了答案的检查逻辑,还移除了<<1+1=2>>等计算步骤,以统一推理步骤的风格。\n\n本数据集采用全量Math23K、GSM8K和翻译后的GSM8K的问题,随后调用gpt-3.5-turbo-0613生成结果,并使用原始数据集中的答案对生成的结果进行验证,过滤掉错误答案,很大程度上保证了问题和答案的准确性。\n\n本次发布了全量数据的25%,包含10K记录。",
"### 语言\n\n中文和英文",
"### 数据集结构\n\n每条数据代表一个完整的题目及答案,包含id、input、output、answer、dataset四个字段。\n\n- id:字符串,代表原始数据集中的题目id,与dataset字段结合可确定唯一题目。\n- input:字符串,代表问题。\n- output:字符串,代表gpt-3.5-turbo-0613生成的答案。\n- answer:字符串,代表正确答案。\n- dataset:字符串,代表原始数据集。",
"### 数据集限制\n\n本数据集的所有响应均由gpt-3.5-turbo-0613生成,并经过初步校验,但仍可能包含不准确的回答。"
] |
[
"TAGS\n#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-Chinese #license-apache-2.0 #region-us \n",
"# BLOSSOM MATH V2",
"### 介绍\n\nBlossom Math V3版本已发布!\n\nBlossom Math V2是基于Math23K和GSM8K衍生而来的中英双语数学对话数据集,适用于数学问题微调。\n\n相比于blossom-math-v1,新增了2500条GSM8K数据和翻译为中文的2500条GSM8K-CN数据。此外,优化了答案的检查逻辑,还移除了<<1+1=2>>等计算步骤,以统一推理步骤的风格。\n\n本数据集采用全量Math23K、GSM8K和翻译后的GSM8K的问题,随后调用gpt-3.5-turbo-0613生成结果,并使用原始数据集中的答案对生成的结果进行验证,过滤掉错误答案,很大程度上保证了问题和答案的准确性。\n\n本次发布了全量数据的25%,包含10K记录。",
"### 语言\n\n中文和英文",
"### 数据集结构\n\n每条数据代表一个完整的题目及答案,包含id、input、output、answer、dataset四个字段。\n\n- id:字符串,代表原始数据集中的题目id,与dataset字段结合可确定唯一题目。\n- input:字符串,代表问题。\n- output:字符串,代表gpt-3.5-turbo-0613生成的答案。\n- answer:字符串,代表正确答案。\n- dataset:字符串,代表原始数据集。",
"### 数据集限制\n\n本数据集的所有响应均由gpt-3.5-turbo-0613生成,并经过初步校验,但仍可能包含不准确的回答。"
] |
[
55,
8,
195,
8,
116,
39
] |
[
"passage: TAGS\n#task_categories-text-generation #task_categories-text2text-generation #size_categories-10K<n<100K #language-Chinese #license-apache-2.0 #region-us \n# BLOSSOM MATH V2### 介绍\n\nBlossom Math V3版本已发布!\n\nBlossom Math V2是基于Math23K和GSM8K衍生而来的中英双语数学对话数据集,适用于数学问题微调。\n\n相比于blossom-math-v1,新增了2500条GSM8K数据和翻译为中文的2500条GSM8K-CN数据。此外,优化了答案的检查逻辑,还移除了<<1+1=2>>等计算步骤,以统一推理步骤的风格。\n\n本数据集采用全量Math23K、GSM8K和翻译后的GSM8K的问题,随后调用gpt-3.5-turbo-0613生成结果,并使用原始数据集中的答案对生成的结果进行验证,过滤掉错误答案,很大程度上保证了问题和答案的准确性。\n\n本次发布了全量数据的25%,包含10K记录。### 语言\n\n中文和英文### 数据集结构\n\n每条数据代表一个完整的题目及答案,包含id、input、output、answer、dataset四个字段。\n\n- id:字符串,代表原始数据集中的题目id,与dataset字段结合可确定唯一题目。\n- input:字符串,代表问题。\n- output:字符串,代表gpt-3.5-turbo-0613生成的答案。\n- answer:字符串,代表正确答案。\n- dataset:字符串,代表原始数据集。### 数据集限制\n\n本数据集的所有响应均由gpt-3.5-turbo-0613生成,并经过初步校验,但仍可能包含不准确的回答。"
] |
cf6811379bbdd94b0c79e59626fdd733593bc6b5
|
# Dataset Card for "keywords_daily_dialog2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
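Until a full card is written, here is a minimal loading sketch based on this repo's declared metadata (features dialog, ids, keywords and a single train split); the standard 🤗 Datasets API is assumed:
```python
from datasets import load_dataset

# Features per the repo metadata: dialog (sequence of strings),
# ids (int64), keywords (sequence of strings); 13,118 train examples.
data = load_dataset("edmundtsou/keywords_daily_dialog2", split="train")
example = data[0]
print(example["dialog"])    # list of utterance strings
print(example["keywords"])  # list of keyword strings
print(example["ids"])       # integer example id
```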
|
edmundtsou/keywords_daily_dialog2
|
[
"region:us"
] |
2023-09-05T02:36:34+00:00
|
{"dataset_info": {"features": [{"name": "dialog", "sequence": "string"}, {"name": "ids", "dtype": "int64"}, {"name": "keywords", "sequence": "string"}], "splits": [{"name": "train", "num_bytes": 9751227, "num_examples": 13118}], "download_size": 5200319, "dataset_size": 9751227}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T02:36:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "keywords_daily_dialog2"
More Information needed
|
[
"# Dataset Card for \"keywords_daily_dialog2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"keywords_daily_dialog2\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"keywords_daily_dialog2\"\n\nMore Information needed"
] |
4c3ce05548baa23f54e8e670d7c2015fa0d9c0d4
|
# Dataset Card for "marketing_email_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
markmp/marketing_email_test
|
[
"region:us"
] |
2023-09-05T02:38:28+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13830, "num_examples": 10}], "download_size": 18502, "dataset_size": 13830}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T02:38:29+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "marketing_email_test"
More Information needed
|
[
"# Dataset Card for \"marketing_email_test\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"marketing_email_test\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"marketing_email_test\"\n\nMore Information needed"
] |
dc80b432f32f018f3b745ae241d99af3403723d4
|
# Dataset Card for "marketing_emails"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
krishanusinha20/marketing_emails
|
[
"region:us"
] |
2023-09-05T02:40:25+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 20941, "num_examples": 10}], "download_size": 26509, "dataset_size": 20941}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T02:40:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "marketing_emails"
More Information needed
|
[
"# Dataset Card for \"marketing_emails\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"marketing_emails\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"marketing_emails\"\n\nMore Information needed"
] |
ddbf8e3390cd61562ab582e3a5dbae09753df0e0
|
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-orca-airoboros-13b-0.10e](https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e",
"harness_gsm8k_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-03T19:30:46.049775](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e/blob/main/results_2023-12-03T19-30-46.049775.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e
|
[
"region:us"
] |
2023-09-05T02:40:32+00:00
|
{"pretty_name": "Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-orca-airoboros-13b-0.10e](https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e\",\n\t\"harness_gsm8k_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-03T19:30:46.049775](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e/blob/main/results_2023-12-03T19-30-46.049775.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/uukuguy/speechless-codellama-orca-airoboros-13b-0.10e", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|arc:challenge|25_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|arc:challenge|25_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T02_12_43.735016", "path": ["**/details_harness|drop|3_2023-10-18T02-12-43.735016.parquet"]}, {"split": "2023_10_28T09_11_54.446220", "path": ["**/details_harness|drop|3_2023-10-28T09-11-54.446220.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-28T09-11-54.446220.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T02_12_43.735016", "path": ["**/details_harness|gsm8k|5_2023-10-18T02-12-43.735016.parquet"]}, {"split": "2023_10_28T09_11_54.446220", "path": ["**/details_harness|gsm8k|5_2023-10-28T09-11-54.446220.parquet"]}, {"split": "2023_12_03T19_08_06.034957", "path": ["**/details_harness|gsm8k|5_2023-12-03T19-08-06.034957.parquet"]}, {"split": "2023_12_03T19_08_12.373310", "path": ["**/details_harness|gsm8k|5_2023-12-03T19-08-12.373310.parquet"]}, {"split": "2023_12_03T19_30_19.333310", "path": ["**/details_harness|gsm8k|5_2023-12-03T19-30-19.333310.parquet"]}, {"split": "2023_12_03T19_30_46.049775", "path": 
["**/details_harness|gsm8k|5_2023-12-03T19-30-46.049775.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-03T19-30-46.049775.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hellaswag|10_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hellaswag|10_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:40:07.595318.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T03:40:07.595318.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": 
[{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T14-42-21.510480.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-12T14-42-21.510480.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T02_12_43.735016", "path": ["**/details_harness|winogrande|5_2023-10-18T02-12-43.735016.parquet"]}, {"split": "2023_10_28T09_11_54.446220", "path": ["**/details_harness|winogrande|5_2023-10-28T09-11-54.446220.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-28T09-11-54.446220.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T03_40_07.595318", "path": ["results_2023-09-05T03:40:07.595318.parquet"]}, {"split": "2023_09_12T14_42_21.510480", "path": ["results_2023-09-12T14-42-21.510480.parquet"]}, {"split": "2023_10_18T02_12_43.735016", "path": ["results_2023-10-18T02-12-43.735016.parquet"]}, {"split": "2023_10_28T09_11_54.446220", "path": ["results_2023-10-28T09-11-54.446220.parquet"]}, {"split": "2023_12_03T19_08_06.034957", "path": ["results_2023-12-03T19-08-06.034957.parquet"]}, {"split": "2023_12_03T19_08_12.373310", "path": ["results_2023-12-03T19-08-12.373310.parquet"]}, {"split": "2023_12_03T19_30_19.333310", "path": ["results_2023-12-03T19-30-19.333310.parquet"]}, {"split": "2023_12_03T19_30_46.049775", "path": ["results_2023-12-03T19-30-46.049775.parquet"]}, {"split": "latest", "path": ["results_2023-12-03T19-30-46.049775.parquet"]}]}]}
|
2023-12-03T19:30:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-airoboros-13b-0.10e on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
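```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-orca-airoboros-13b-0.10e",
	"harness_gsm8k_5",
	split="train")
```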
## Latest results
These are the latest results from run 2023-12-03T19:30:46.049775 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
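```python
{
    "all": {
        "acc": 0.0,
        "acc_stderr": 0.0
    },
    "harness|gsm8k|5": {
        "acc": 0.0,
        "acc_stderr": 0.0
    }
}
```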
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-airoboros-13b-0.10e on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T19:30:46.049775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-airoboros-13b-0.10e on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-03T19:30:46.049775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
34,
31,
183,
68,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-codellama-orca-airoboros-13b-0.10e## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-codellama-orca-airoboros-13b-0.10e on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 8 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-03T19:30:46.049775(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
f3bcb219aab6a6b492ebfe2ded45b842cd3acac3
|
{"input": "What color is the sky?", "output": "The sky is blue."}
{"input": "Where is the best place to get cloud GPUs?", "output": "Brev.dev"}
{"input": "Why do Americans love guns so much?", "output": "Because of the Spanish."}
|
tnash6/test
|
[
"region:us"
] |
2023-09-05T02:41:38+00:00
|
{}
|
2023-09-05T02:42:18+00:00
|
[] |
[] |
TAGS
#region-us
|
{"input": "What color is the sky?", "output": "The sky is blue."}
{"input": "Where is the best place to get cloud GPUs?", "output": "URL"}
{"input": "Why do Americans love guns so much?", "output": "Because of the Spanish."}
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
62a31325e2d13b89d96d589c76ba9e3201a763af
|
# Dataset Card for "generativeai_sample_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
jschew39/generativeai_sample_data
|
[
"region:us"
] |
2023-09-05T02:41:51+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 23408, "num_examples": 12}], "download_size": 27052, "dataset_size": 23408}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T02:41:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "generativeai_sample_data"
More Information needed
|
[
"# Dataset Card for \"generativeai_sample_data\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"generativeai_sample_data\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"generativeai_sample_data\"\n\nMore Information needed"
] |
ae8f6a9b469731ff5f1bb6c4cc17bdc98479e169
|
testing!
|
yanabels/churchill-data
|
[
"license:apache-2.0",
"region:us"
] |
2023-09-05T02:48:15+00:00
|
{"license": "apache-2.0", "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 5061, "num_examples": 17}], "download_size": 5411, "dataset_size": 5061}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T03:04:33+00:00
|
[] |
[] |
TAGS
#license-apache-2.0 #region-us
|
testing!
|
[] |
[
"TAGS\n#license-apache-2.0 #region-us \n"
] |
[
14
] |
[
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
82ab46cb5151ff4faf49d80f560a963684cc4d57
|
# Dataset Card for "physics4kids"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
lohleonard93/physics4kids
|
[
"region:us"
] |
2023-09-05T02:53:18+00:00
|
{"dataset_info": {"features": [{"name": "topics", "dtype": "string"}, {"name": "explain", "dtype": "string"}, {"name": "simplified", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13730, "num_examples": 10}], "download_size": 17516, "dataset_size": 13730}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T02:53:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "physics4kids"
More Information needed
|
[
"# Dataset Card for \"physics4kids\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"physics4kids\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"physics4kids\"\n\nMore Information needed"
] |
31ea39720df3edc5b8e2beb9a61d546922387a4d
|
# Dataset Card for "hf_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
MayG/hf_dataset
|
[
"region:us"
] |
2023-09-05T02:58:37+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 19405, "num_examples": 10}], "download_size": 26542, "dataset_size": 19405}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T02:58:39+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "hf_dataset"
More Information needed
|
[
"# Dataset Card for \"hf_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"hf_dataset\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"hf_dataset\"\n\nMore Information needed"
] |
de5b0e6f37e365265ee50682e11c70cfe1d57c87
|
# Dataset Card for "marketmail"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
kedargsm/marketmail
|
[
"region:us"
] |
2023-09-05T03:00:56+00:00
|
{"dataset_info": {"features": [{"name": "product", "dtype": "string"}, {"name": "description", "dtype": "string"}, {"name": "marketing_email", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 97243, "num_examples": 50}], "download_size": 68524, "dataset_size": 97243}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T03:00:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "marketmail"
More Information needed
|
[
"# Dataset Card for \"marketmail\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"marketmail\"\n\nMore Information needed"
] |
[
6,
12
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"marketmail\"\n\nMore Information needed"
] |
054d59d0c8d050993a6d7c8ca721216ed9c80e7a
|
# Dataset Card for "starwars_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
izaq09/starwars_dataset
|
[
"region:us"
] |
2023-09-05T03:01:06+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2966960.0, "num_examples": 7}], "download_size": 2933224, "dataset_size": 2966960.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-08T12:30:31+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "starwars_dataset"
More Information needed
|
[
"# Dataset Card for \"starwars_dataset\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"starwars_dataset\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"starwars_dataset\"\n\nMore Information needed"
] |
21324450a91d90f64ef6fff814f4decd596d48a3
|
The dataset comes from the work introduced in "Shepherd: A Critic for Language Model Generation". We translated it into Simplified Chinese using Google Translate and performed appropriate manual checks. We hope to do more valuable work in the Chinese field, and we also hope that capable researchers can further verify the sentences against Chinese grammar or rewrite them where needed.
|
frankminors123/chinese-shepherd-critic-dataset
|
[
"task_categories:text-generation",
"task_categories:question-answering",
"size_categories:1M<n<10M",
"language:zh",
"license:apache-2.0",
"region:us"
] |
2023-09-05T03:06:41+00:00
|
{"language": ["zh"], "license": "apache-2.0", "size_categories": ["1M<n<10M"], "task_categories": ["text-generation", "question-answering"]}
|
2023-09-05T06:45:55+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-text-generation #task_categories-question-answering #size_categories-1M<n<10M #language-Chinese #license-apache-2.0 #region-us
|
The dataset comes from the work introduced in "Shepherd: A Critic for Language Model Generation". We translated it into Simplified Chinese using Google Translate and performed appropriate manual checks. We hope to do more valuable work in the Chinese field, and we also hope that capable researchers can further verify the sentences against Chinese grammar or rewrite them where needed.
|
[] |
[
"TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-1M<n<10M #language-Chinese #license-apache-2.0 #region-us \n"
] |
[
54
] |
[
"passage: TAGS\n#task_categories-text-generation #task_categories-question-answering #size_categories-1M<n<10M #language-Chinese #license-apache-2.0 #region-us \n"
] |
a8d9ab84ae1be7b22db75fc105b12bae99a040aa
|
# Dataset Card for "Maintenance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Jaya1995/Maintenance
|
[
"region:us"
] |
2023-09-05T03:07:15+00:00
|
{"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6677, "num_examples": 100}], "download_size": 4106, "dataset_size": 6677}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T03:15:41+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "Maintenance"
More Information needed
|
[
"# Dataset Card for \"Maintenance\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"Maintenance\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"Maintenance\"\n\nMore Information needed"
] |
bd6616e7db80143258ba7481ffd15cb51140dd02
|
# Dataset Card for "tobasesentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
wsin/tobasesentences
|
[
"region:us"
] |
2023-09-05T03:12:40+00:00
|
{"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}, {"name": "base base_sentences", "dtype": "string"}, {"name": "base_sentences", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1141, "num_examples": 4}], "download_size": 4035, "dataset_size": 1141}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T03:12:44+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "tobasesentences"
More Information needed
|
[
"# Dataset Card for \"tobasesentences\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"tobasesentences\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"tobasesentences\"\n\nMore Information needed"
] |
c684c4e43e25291097139e828f28032a85171250
|
# Dataset Card for "flan2021_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
LahiruLowe/flan2021_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML
|
[
"region:us"
] |
2023-09-05T03:13:20+00:00
|
{"dataset_info": {"features": [{"name": "original_index", "dtype": "int64"}, {"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "task_source", "dtype": "string"}, {"name": "task_name", "dtype": "string"}, {"name": "template_type", "dtype": "string"}, {"name": "system_message", "dtype": "string"}, {"name": "explained_targets", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 258822, "num_examples": 209}], "download_size": 0, "dataset_size": 258822}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-10T03:04:54+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "flan2021_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML"
More Information needed
|
[
"# Dataset Card for \"flan2021_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"flan2021_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
6,
46
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"flan2021_explanation_targets_h2ogpt-gm-oasst1-en-2048-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
d60589736cc856390b0518ccfa0453cd7c310b49
|
# Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sachith-surge/LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2
|
[
"region:us"
] |
2023-09-05T03:39:44+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "llama2_status", "dtype": "string"}, {"name": "llama2_rating", "dtype": "string"}, {"name": "llama2_reason", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2241878, "num_examples": 1505}], "download_size": 1173351, "dataset_size": 2241878}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T07:01:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML"
More Information needed
|
[
"# Dataset Card for \"LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
[
6,
37
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML\"\n\nMore Information needed"
] |
ba86eb38cca1b5d46d3ef4982666a1a0229c31fd
|
# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/AIDC-ai-business/Marcoroni-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [AIDC-ai-business/Marcoroni-7B](https://huggingface.co/AIDC-ai-business/Marcoroni-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B",
"harness_truthfulqa_mc_0",
split="train")
```
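Because each configuration also exposes a "latest" split alias (see the configs listed in this card's metadata), you can pin the most recent run explicitly instead of relying on the default "train" split; a minimal sketch, assuming the split names shown in this card:
```python
from datasets import load_dataset

# Minimal sketch: load the same details via the "latest" split alias.
# Per this card, "train" and "latest" both point at the most recent run.
latest = load_dataset(
    "open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B",
    "harness_truthfulqa_mc_0",
    split="latest",
)
print(latest)
```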
## Latest results
These are the [latest results from run 2023-09-11T14:58:05.245524](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B/blob/main/results_2023-09-11T14-58-05.245524.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5159772470651705,
"acc_stderr": 0.03490050368845693,
"acc_norm": 0.5196198874675843,
"acc_norm_stderr": 0.03488383911166199,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5084843623108531,
"mc2_stderr": 0.015788699144390992
},
"harness|arc:challenge|25": {
"acc": 0.5537542662116041,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.5810580204778157,
"acc_norm_stderr": 0.014418106953639013
},
"harness|hellaswag|10": {
"acc": 0.6132244572794264,
"acc_stderr": 0.004860162076330978,
"acc_norm": 0.8008364867556264,
"acc_norm_stderr": 0.0039855506403304606
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5625,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.032671518489247764,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.032671518489247764
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.044045561573747664,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.044045561573747664
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523867,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523867
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.040061680838488774,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.040061680838488774
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5612903225806452,
"acc_stderr": 0.028229497320317216,
"acc_norm": 0.5612903225806452,
"acc_norm_stderr": 0.028229497320317216
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.0331847733384533,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.0331847733384533
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7512953367875648,
"acc_stderr": 0.031195840877700286,
"acc_norm": 0.7512953367875648,
"acc_norm_stderr": 0.031195840877700286
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.02533466708095495,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.02533466708095495
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5126050420168067,
"acc_stderr": 0.03246816765752174,
"acc_norm": 0.5126050420168067,
"acc_norm_stderr": 0.03246816765752174
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7192660550458716,
"acc_stderr": 0.019266055045871623,
"acc_norm": 0.7192660550458716,
"acc_norm_stderr": 0.019266055045871623
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.03228210387037892,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.03228210387037892
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7215189873417721,
"acc_stderr": 0.029178682304842548,
"acc_norm": 0.7215189873417721,
"acc_norm_stderr": 0.029178682304842548
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.03292802819330314,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.03292802819330314
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112723,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.027236013946196704,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.027236013946196704
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7100893997445722,
"acc_stderr": 0.01622501794477098,
"acc_norm": 0.7100893997445722,
"acc_norm_stderr": 0.01622501794477098
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5809248554913294,
"acc_stderr": 0.02656417811142262,
"acc_norm": 0.5809248554913294,
"acc_norm_stderr": 0.02656417811142262
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.264804469273743,
"acc_stderr": 0.014756906483260664,
"acc_norm": 0.264804469273743,
"acc_norm_stderr": 0.014756906483260664
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5261437908496732,
"acc_stderr": 0.028590752958852394,
"acc_norm": 0.5261437908496732,
"acc_norm_stderr": 0.028590752958852394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401266,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.027586006221607708,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.027586006221607708
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.38396349413298564,
"acc_stderr": 0.01242158783313423,
"acc_norm": 0.38396349413298564,
"acc_norm_stderr": 0.01242158783313423
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4889705882352941,
"acc_stderr": 0.03036544647727568,
"acc_norm": 0.4889705882352941,
"acc_norm_stderr": 0.03036544647727568
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626912,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626912
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661896,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661896
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.03528211258245229,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.03528211258245229
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5084843623108531,
"mc2_stderr": 0.015788699144390992
}
}
```
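If you prefer to work with the raw aggregate file linked above rather than the per-example details, a minimal sketch (assuming the `huggingface_hub` client is installed; the filename mirrors the link in this card) could look like:
```python
import json

from huggingface_hub import hf_hub_download

# Minimal sketch: fetch the raw results JSON referenced above and print the
# aggregate metrics. The exact top-level layout may differ between dumps,
# so fall back to the document root if there is no "results" key.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B",
    filename="results_2023-09-11T14-58-05.245524.json",
    repo_type="dataset",
)
with open(path) as f:
    payload = json.load(f)
metrics = payload.get("results", payload)
print(metrics["all"])  # overall acc / acc_norm / mc1 / mc2 as shown above
```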
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B
|
[
"region:us"
] |
2023-09-05T04:01:40+00:00
|
{"pretty_name": "Evaluation run of AIDC-ai-business/Marcoroni-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [AIDC-ai-business/Marcoroni-7B](https://huggingface.co/AIDC-ai-business/Marcoroni-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B\",\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-11T14:58:05.245524](https://huggingface.co/datasets/open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B/blob/main/results_2023-09-11T14-58-05.245524.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5159772470651705,\n \"acc_stderr\": 0.03490050368845693,\n \"acc_norm\": 0.5196198874675843,\n \"acc_norm_stderr\": 0.03488383911166199,\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n \"mc2_stderr\": 0.015788699144390992\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.014526705548539982,\n \"acc_norm\": 0.5810580204778157,\n \"acc_norm_stderr\": 0.014418106953639013\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6132244572794264,\n \"acc_stderr\": 0.004860162076330978,\n \"acc_norm\": 0.8008364867556264,\n \"acc_norm_stderr\": 0.0039855506403304606\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 
0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.032671518489247764,\n \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.032671518489247764\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n \"acc_stderr\": 0.044045561573747664,\n \"acc_norm\": 0.32456140350877194,\n \"acc_norm_stderr\": 0.044045561573747664\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523867,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523867\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.040061680838488774,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.040061680838488774\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5612903225806452,\n \"acc_stderr\": 0.028229497320317216,\n \"acc_norm\": 0.5612903225806452,\n \"acc_norm_stderr\": 0.028229497320317216\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.0331847733384533,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.0331847733384533\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7512953367875648,\n \"acc_stderr\": 0.031195840877700286,\n \"acc_norm\": 0.7512953367875648,\n \"acc_norm_stderr\": 0.031195840877700286\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.48205128205128206,\n \"acc_stderr\": 
0.02533466708095495,\n \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.02533466708095495\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5126050420168067,\n \"acc_stderr\": 0.03246816765752174,\n \"acc_norm\": 0.5126050420168067,\n \"acc_norm_stderr\": 0.03246816765752174\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7192660550458716,\n \"acc_stderr\": 0.019266055045871623,\n \"acc_norm\": 0.7192660550458716,\n \"acc_norm_stderr\": 0.019266055045871623\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.03228210387037892,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.03228210387037892\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7215189873417721,\n \"acc_stderr\": 0.029178682304842548,\n \"acc_norm\": 0.7215189873417721,\n \"acc_norm_stderr\": 0.029178682304842548\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n \"acc_stderr\": 0.03292802819330314,\n \"acc_norm\": 0.5964125560538116,\n \"acc_norm_stderr\": 0.03292802819330314\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112723,\n \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112723\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.027236013946196704,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.027236013946196704\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7100893997445722,\n \"acc_stderr\": 0.01622501794477098,\n \"acc_norm\": 0.7100893997445722,\n \"acc_norm_stderr\": 0.01622501794477098\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 
0.5809248554913294,\n \"acc_stderr\": 0.02656417811142262,\n \"acc_norm\": 0.5809248554913294,\n \"acc_norm_stderr\": 0.02656417811142262\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.264804469273743,\n \"acc_stderr\": 0.014756906483260664,\n \"acc_norm\": 0.264804469273743,\n \"acc_norm_stderr\": 0.014756906483260664\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5261437908496732,\n \"acc_stderr\": 0.028590752958852394,\n \"acc_norm\": 0.5261437908496732,\n \"acc_norm_stderr\": 0.028590752958852394\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n \"acc_stderr\": 0.027950481494401266,\n \"acc_norm\": 0.5884244372990354,\n \"acc_norm_stderr\": 0.027950481494401266\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5648148148148148,\n \"acc_stderr\": 0.027586006221607708,\n \"acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.027586006221607708\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.38396349413298564,\n \"acc_stderr\": 0.01242158783313423,\n \"acc_norm\": 0.38396349413298564,\n \"acc_norm_stderr\": 0.01242158783313423\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4889705882352941,\n \"acc_stderr\": 0.03036544647727568,\n \"acc_norm\": 0.4889705882352941,\n \"acc_norm_stderr\": 0.03036544647727568\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626912,\n \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626912\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661896,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661896\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.03528211258245229,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.03528211258245229\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5084843623108531,\n \"mc2_stderr\": 0.015788699144390992\n }\n}\n```", "repo_url": "https://huggingface.co/AIDC-ai-business/Marcoroni-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|arc:challenge|25_2023-09-05T05:01:15.449449.parquet"]}, 
{"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|arc:challenge|25_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hellaswag|10_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hellaswag|10_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:01:15.449449.parquet", 
"**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T05:01:15.449449.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet", 
"**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet", 
"**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet", 
"**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": 
["**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": 
[{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": 
["**/details_harness|hendrycksTest-management|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-philosophy|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", 
"data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-11T14-58-05.245524.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T05_01_15.449449", "path": ["results_2023-09-05T05:01:15.449449.parquet"]}, {"split": "2023_09_11T14_58_05.245524", "path": ["results_2023-09-11T14-58-05.245524.parquet"]}, {"split": "latest", "path": ["results_2023-09-11T14-58-05.245524.parquet"]}]}]}
|
2023-09-11T13:59:23+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model AIDC-ai-business/Marcoroni-7B on the Open LLM Leaderboard.
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
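A minimal sketch is shown below. The repository id is inferred from the `details_<org>__<model>` naming pattern used across this collection, and the configuration name is taken from this dataset's metadata; both should be treated as assumptions rather than confirmed values.
```python
from datasets import load_dataset

# Repository id inferred from the collection's naming convention
# (org/model -> details_org__model); treat it as an assumption.
data = load_dataset("open-llm-leaderboard/details_AIDC-ai-business__Marcoroni-7B",
	"harness_truthfulqa_mc_0",  # one of the configurations listed in the metadata
	split="latest")             # "latest" always points at the most recent run
```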
## Latest results
These are the latest results from run 2023-09-11T14:58:05.245524 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AIDC-ai-business/Marcoroni-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-11T14:58:05.245524(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model AIDC-ai-business/Marcoroni-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-11T14:58:05.245524(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AIDC-ai-business/Marcoroni-7B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model AIDC-ai-business/Marcoroni-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 61 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-11T14:58:05.245524(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
2996d1e874543dfff369d3707948dc3929765104
|
# Dataset Card for Evaluation run of Undi95/MLewd-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/MLewd-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/MLewd-L2-13B](https://huggingface.co/Undi95/MLewd-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-L2-13B",
"harness_winogrande_5",
split="train")
```
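The aggregated scores live in the "results" configuration mentioned above; a minimal sketch, assuming the same repository id as in the snippet just shown:
```python
from datasets import load_dataset

# "results" stores the aggregated metrics of each run; the "latest"
# split always points at the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-L2-13B",
	"results",
	split="latest")
```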
## Latest results
These are the [latest results from run 2023-10-18T07:35:31.407630](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-13B/blob/main/results_2023-10-18T07-35-31.407630.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.012164429530201342,
"em_stderr": 0.0011226072817372202,
"f1": 0.09181417785234938,
"f1_stderr": 0.0019450870531667406,
"acc": 0.37384759088376845,
"acc_stderr": 0.007756725366346258
},
"harness|drop|3": {
"em": 0.012164429530201342,
"em_stderr": 0.0011226072817372202,
"f1": 0.09181417785234938,
"f1_stderr": 0.0019450870531667406
},
"harness|gsm8k|5": {
"acc": 0.01288855193328279,
"acc_stderr": 0.003106901266499655
},
"harness|winogrande|5": {
"acc": 0.7348066298342542,
"acc_stderr": 0.012406549466192861
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Undi95__MLewd-L2-13B
|
[
"region:us"
] |
2023-09-05T04:06:36+00:00
|
{"pretty_name": "Evaluation run of Undi95/MLewd-L2-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/MLewd-L2-13B](https://huggingface.co/Undi95/MLewd-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__MLewd-L2-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T07:35:31.407630](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__MLewd-L2-13B/blob/main/results_2023-10-18T07-35-31.407630.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.012164429530201342,\n \"em_stderr\": 0.0011226072817372202,\n \"f1\": 0.09181417785234938,\n \"f1_stderr\": 0.0019450870531667406,\n \"acc\": 0.37384759088376845,\n \"acc_stderr\": 0.007756725366346258\n },\n \"harness|drop|3\": {\n \"em\": 0.012164429530201342,\n \"em_stderr\": 0.0011226072817372202,\n \"f1\": 0.09181417785234938,\n \"f1_stderr\": 0.0019450870531667406\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01288855193328279,\n \"acc_stderr\": 0.003106901266499655\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7348066298342542,\n \"acc_stderr\": 0.012406549466192861\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/MLewd-L2-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|arc:challenge|25_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T07_35_31.407630", "path": ["**/details_harness|drop|3_2023-10-18T07-35-31.407630.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T07-35-31.407630.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T07_35_31.407630", "path": ["**/details_harness|gsm8k|5_2023-10-18T07-35-31.407630.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T07-35-31.407630.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hellaswag|10_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", 
"path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:06:12.728207.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T05:06:12.728207.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T05:06:12.728207.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T07_35_31.407630", "path": ["**/details_harness|winogrande|5_2023-10-18T07-35-31.407630.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T07-35-31.407630.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T05_06_12.728207", "path": ["results_2023-09-05T05:06:12.728207.parquet"]}, {"split": "2023_10_18T07_35_31.407630", "path": ["results_2023-10-18T07-35-31.407630.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T07-35-31.407630.parquet"]}]}]}
|
2023-10-18T06:35:44+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/MLewd-L2-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Undi95/MLewd-L2-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
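A minimal sketch, assuming the repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (the exact repo id is inferred from that convention, not stated above):

```python
from datasets import load_dataset

# Assumed repo id, inferred from the leaderboard's details_<org>__<model> convention.
data = load_dataset("open-llm-leaderboard/details_Undi95__MLewd-L2-13B",
                    "harness_winogrande_5",
                    split="train")
```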
## Latest results
These are the latest results from run 2023-10-18T07:35:31.407630 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Undi95/MLewd-L2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/MLewd-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T07:35:31.407630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/MLewd-L2-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/MLewd-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T07:35:31.407630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
21,
31,
169,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/MLewd-L2-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/MLewd-L2-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T07:35:31.407630(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
cb8d177b3e74c0a67c60e38084f8e9c51671779c
|
# Dataset Card for "pubmed_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
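A minimal loading sketch; the repo id and the single 100-example `test` split are taken from this card's metadata, the rest is standard `datasets` usage:

```python
from datasets import load_dataset

# The default config exposes one "test" split of 100 PubMed records
# (see the dataset_info in this card's metadata).
ds = load_dataset("zxvix/pubmed_100", split="test")
print(ds[0]["title"])  # rows also carry MedlineCitation/PubmedData structs and a flat "text" field
```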
|
zxvix/pubmed_100
|
[
"region:us"
] |
2023-09-05T04:13:42+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "MedlineCitation", "struct": [{"name": "PMID", "dtype": "int32"}, {"name": "DateCompleted", "struct": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}, {"name": "NumberOfReferences", "dtype": "int32"}, {"name": "DateRevised", "struct": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}, {"name": "Article", "struct": [{"name": "Abstract", "struct": [{"name": "AbstractText", "dtype": "string"}]}, {"name": "ArticleTitle", "dtype": "string"}, {"name": "AuthorList", "struct": [{"name": "Author", "sequence": [{"name": "LastName", "dtype": "string"}, {"name": "ForeName", "dtype": "string"}, {"name": "Initials", "dtype": "string"}, {"name": "CollectiveName", "dtype": "string"}]}]}, {"name": "Language", "dtype": "string"}, {"name": "GrantList", "struct": [{"name": "Grant", "sequence": [{"name": "GrantID", "dtype": "string"}, {"name": "Agency", "dtype": "string"}, {"name": "Country", "dtype": "string"}]}]}, {"name": "PublicationTypeList", "struct": [{"name": "PublicationType", "sequence": "string"}]}]}, {"name": "MedlineJournalInfo", "struct": [{"name": "Country", "dtype": "string"}]}, {"name": "ChemicalList", "struct": [{"name": "Chemical", "sequence": [{"name": "RegistryNumber", "dtype": "string"}, {"name": "NameOfSubstance", "dtype": "string"}]}]}, {"name": "CitationSubset", "dtype": "string"}, {"name": "MeshHeadingList", "struct": [{"name": "MeshHeading", "sequence": [{"name": "DescriptorName", "dtype": "string"}, {"name": "QualifierName", "dtype": "string"}]}]}]}, {"name": "PubmedData", "struct": [{"name": "ArticleIdList", "sequence": [{"name": "ArticleId", "sequence": "string"}]}, {"name": "PublicationStatus", "dtype": "string"}, {"name": "History", "struct": [{"name": "PubMedPubDate", "sequence": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}]}, {"name": "ReferenceList", "sequence": [{"name": "Citation", "dtype": "string"}, {"name": "CitationId", "dtype": "int32"}]}]}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 303320.4166457245, "num_examples": 100}], "download_size": 214047, "dataset_size": 303320.4166457245}}
|
2023-09-05T04:14:30+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pubmed_100"
More Information needed
|
[
"# Dataset Card for \"pubmed_100\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pubmed_100\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pubmed_100\"\n\nMore Information needed"
] |
61c1a4db9e257c82f245d4722dd5adbec22d22f8
|
# Dataset Card for "04a71b5a"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/04a71b5a
|
[
"region:us"
] |
2023-09-05T04:17:17+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 184, "num_examples": 10}], "download_size": 1337, "dataset_size": 184}}
|
2023-09-05T04:17:18+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "04a71b5a"
More Information needed
|
[
"# Dataset Card for \"04a71b5a\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"04a71b5a\"\n\nMore Information needed"
] |
[
6,
16
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"04a71b5a\"\n\nMore Information needed"
] |
2d5243e4e01a2a0e08bec9a3594dbf67fed200c4
|
# Dataset Card for "pubmed_nonbiomedical_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
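A minimal loading sketch for this variant; per the dataset_info, its schema matches `pubmed_100` with an extra `original_text` column:

```python
from datasets import load_dataset

# Same layout as zxvix/pubmed_100 plus an "original_text" column (see dataset_info).
ds = load_dataset("zxvix/pubmed_nonbiomedical_100", split="test")
row = ds[0]
print(row["title"])
print(row["original_text"][:200])  # extra column preserved alongside "text"
```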
|
zxvix/pubmed_nonbiomedical_100
|
[
"region:us"
] |
2023-09-05T04:26:02+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "MedlineCitation", "struct": [{"name": "PMID", "dtype": "int32"}, {"name": "DateCompleted", "struct": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}, {"name": "NumberOfReferences", "dtype": "int32"}, {"name": "DateRevised", "struct": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}, {"name": "Article", "struct": [{"name": "Abstract", "struct": [{"name": "AbstractText", "dtype": "string"}]}, {"name": "ArticleTitle", "dtype": "string"}, {"name": "AuthorList", "struct": [{"name": "Author", "sequence": [{"name": "LastName", "dtype": "string"}, {"name": "ForeName", "dtype": "string"}, {"name": "Initials", "dtype": "string"}, {"name": "CollectiveName", "dtype": "string"}]}]}, {"name": "Language", "dtype": "string"}, {"name": "GrantList", "struct": [{"name": "Grant", "sequence": [{"name": "GrantID", "dtype": "string"}, {"name": "Agency", "dtype": "string"}, {"name": "Country", "dtype": "string"}]}]}, {"name": "PublicationTypeList", "struct": [{"name": "PublicationType", "sequence": "string"}]}]}, {"name": "MedlineJournalInfo", "struct": [{"name": "Country", "dtype": "string"}]}, {"name": "ChemicalList", "struct": [{"name": "Chemical", "sequence": [{"name": "RegistryNumber", "dtype": "string"}, {"name": "NameOfSubstance", "dtype": "string"}]}]}, {"name": "CitationSubset", "dtype": "string"}, {"name": "MeshHeadingList", "struct": [{"name": "MeshHeading", "sequence": [{"name": "DescriptorName", "dtype": "string"}, {"name": "QualifierName", "dtype": "string"}]}]}]}, {"name": "PubmedData", "struct": [{"name": "ArticleIdList", "sequence": [{"name": "ArticleId", "sequence": "string"}]}, {"name": "PublicationStatus", "dtype": "string"}, {"name": "History", "struct": [{"name": "PubMedPubDate", "sequence": [{"name": "Year", "dtype": "int32"}, {"name": "Month", "dtype": "int32"}, {"name": "Day", "dtype": "int32"}]}]}, {"name": "ReferenceList", "sequence": [{"name": "Citation", "dtype": "string"}, {"name": "CitationId", "dtype": "int32"}]}]}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 412796.0, "num_examples": 100}], "download_size": 281974, "dataset_size": 412796.0}}
|
2023-09-05T04:26:07+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pubmed_nonbiomedical_100"
More Information needed
|
[
"# Dataset Card for \"pubmed_nonbiomedical_100\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pubmed_nonbiomedical_100\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pubmed_nonbiomedical_100\"\n\nMore Information needed"
] |
c3d6080bd92545f9cb4c4bd9092eeff0ca612d9d
|
# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B-PIPPA
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-L2-13B-PIPPA](https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA",
"harness_winogrande_5",
split="train")
```
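To read the aggregated numbers shown below programmatically, a sketch using the `results` configuration declared in this card's metadata (its `latest` split mirrors the most recent run):

```python
from datasets import load_dataset

# The "results" config aggregates every run; "latest" points at the newest results file.
results = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA",
                       "results",
                       split="latest")
print(results[0])
```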
## Latest results
These are the [latest results from run 2023-10-15T22:47:55.884527](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA/blob/main/results_2023-10-15T22-47-55.884527.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3598993288590604,
"em_stderr": 0.004915348455608255,
"f1": 0.4368917785234919,
"f1_stderr": 0.004726186762311207,
"acc": 0.3873174710218511,
"acc_stderr": 0.008457350051798611
},
"harness|drop|3": {
"em": 0.3598993288590604,
"em_stderr": 0.004915348455608255,
"f1": 0.4368917785234919,
"f1_stderr": 0.004726186762311207
},
"harness|gsm8k|5": {
"acc": 0.029567854435178165,
"acc_stderr": 0.004665893134220799
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA
|
[
"region:us"
] |
2023-09-05T04:30:14+00:00
|
{"pretty_name": "Evaluation run of Undi95/ReMM-L2-13B-PIPPA", "dataset_summary": "Dataset automatically created during the evaluation run of model [Undi95/ReMM-L2-13B-PIPPA](https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-15T22:47:55.884527](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA/blob/main/results_2023-10-15T22-47-55.884527.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3598993288590604,\n \"em_stderr\": 0.004915348455608255,\n \"f1\": 0.4368917785234919,\n \"f1_stderr\": 0.004726186762311207,\n \"acc\": 0.3873174710218511,\n \"acc_stderr\": 0.008457350051798611\n },\n \"harness|drop|3\": {\n \"em\": 0.3598993288590604,\n \"em_stderr\": 0.004915348455608255,\n \"f1\": 0.4368917785234919,\n \"f1_stderr\": 0.004726186762311207\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \"acc_stderr\": 0.004665893134220799\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n }\n}\n```", "repo_url": "https://huggingface.co/Undi95/ReMM-L2-13B-PIPPA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|arc:challenge|25_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_15T22_47_55.884527", "path": ["**/details_harness|drop|3_2023-10-15T22-47-55.884527.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-15T22-47-55.884527.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_15T22_47_55.884527", "path": ["**/details_harness|gsm8k|5_2023-10-15T22-47-55.884527.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-15T22-47-55.884527.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hellaswag|10_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": 
"2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T05:29:49.738166.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T05:29:49.738166.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_15T22_47_55.884527", "path": ["**/details_harness|winogrande|5_2023-10-15T22-47-55.884527.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-15T22-47-55.884527.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T05_29_49.738166", "path": ["results_2023-09-05T05:29:49.738166.parquet"]}, {"split": "2023_10_15T22_47_55.884527", "path": ["results_2023-10-15T22-47-55.884527.parquet"]}, {"split": "latest", "path": ["results_2023-10-15T22-47-55.884527.parquet"]}]}]}
|
2023-10-15T21:48:07+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B-PIPPA
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B-PIPPA on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
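(A minimal sketch; the repo id below is inferred from the `details_<org>__<model>` naming pattern used across this collection, so treat it as an assumption.)

```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_Undi95__ReMM-L2-13B-PIPPA",
    "harness_winogrande_5",
    split="latest",  # or the timestamped split of a specific run
)
```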
## Latest results
These are the latest results from run 2023-10-15T22:47:55.884527 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B-PIPPA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B-PIPPA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T22:47:55.884527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B-PIPPA",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B-PIPPA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-15T22:47:55.884527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
23,
31,
171,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B-PIPPA## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Undi95/ReMM-L2-13B-PIPPA on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-15T22:47:55.884527(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e3c3ad09f32a26c533eb87e1c30bb794beac8727
|
# Dataset Card for "DialogueActPairing_DailyTalk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
DynamicSuperb/DialogueActPairing_DailyTalk
|
[
"region:us"
] |
2023-09-05T04:39:00+00:00
|
{"dataset_info": {"features": [{"name": "file", "dtype": "string"}, {"name": "audio", "dtype": "audio"}, {"name": "file2", "dtype": "string"}, {"name": "audio2", "dtype": "audio"}, {"name": "instruction", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 1146410031.0, "num_examples": 2000}], "download_size": 988425921, "dataset_size": 1146410031.0}}
|
2023-11-01T08:35:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "DialogueActPairing_DailyTalk"
More Information needed
|
[
"# Dataset Card for \"DialogueActPairing_DailyTalk\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"DialogueActPairing_DailyTalk\"\n\nMore Information needed"
] |
[
6,
21
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"DialogueActPairing_DailyTalk\"\n\nMore Information needed"
] |
d2ad735251aec882de5c8a1026b153646c79de7c
|
# Dataset Card for "chart-to-table-mix"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
chiragtubakad/chart-to-table-mix
|
[
"region:us"
] |
2023-09-05T04:47:46+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 102169807.41570717, "num_examples": 2245}, {"name": "test", "num_bytes": 25042009.85429284, "num_examples": 562}], "download_size": 108880031, "dataset_size": 127211817.27000001}}
|
2023-09-05T04:48:07+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "chart-to-table-mix"
More Information needed
|
[
"# Dataset Card for \"chart-to-table-mix\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"chart-to-table-mix\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"chart-to-table-mix\"\n\nMore Information needed"
] |
838ba3133eb5053bfdd2d5b7c96247ba0e5044c5
|
# Dataset Card for "AsosoftWhisperv2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
abdulhade/AsosoftWhisperv2
|
[
"region:us"
] |
2023-09-05T04:51:55+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "input_features", "sequence": {"sequence": "float32"}}, {"name": "labels", "sequence": "int64"}], "splits": [{"name": "train", "num_bytes": 28038699208, "num_examples": 29188}], "download_size": 4307818668, "dataset_size": 28038699208}}
|
2023-09-05T08:02:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "AsosoftWhisperv2"
More Information needed
|
[
"# Dataset Card for \"AsosoftWhisperv2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"AsosoftWhisperv2\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"AsosoftWhisperv2\"\n\nMore Information needed"
] |
794a47b489e502da1793333064b0b6e962a23c6d
|
Data prepared for training a Llama 2 model.
The data is structured to differentiate between different types of charts based on their X and Y axes.
|
BlahBlah1/Datavisualisation
|
[
"license:apache-2.0",
"region:us"
] |
2023-09-05T05:00:44+00:00
|
{"license": "apache-2.0"}
|
2023-09-05T06:15:18+00:00
|
[] |
[] |
TAGS
#license-apache-2.0 #region-us
|
Data prepared for training a Llama 2 model.
The data is structured to differentiate between different types of charts based on their X and Y axes.
|
[] |
[
"TAGS\n#license-apache-2.0 #region-us \n"
] |
[
14
] |
[
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
5cdbbb5eec1f811b93df5a5654149312fc1f5831
|
# Dataset Card for "squad_instruction_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
tyzhu/squad_instruction_v1
|
[
"region:us"
] |
2023-09-05T05:24:01+00:00
|
{"dataset_info": {"features": [{"name": "inputs", "dtype": "string"}, {"name": "targets", "dtype": "string"}, {"name": "question", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "answers", "struct": [{"name": "answer_start", "sequence": "int64"}, {"name": "text", "sequence": "string"}]}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1240694304, "num_examples": 700792}, {"name": "validation", "num_bytes": 159695683, "num_examples": 84560}], "download_size": 89672897, "dataset_size": 1400389987}}
|
2023-09-12T16:03:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "squad_instruction_v1"
More Information needed
|
[
"# Dataset Card for \"squad_instruction_v1\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"squad_instruction_v1\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"squad_instruction_v1\"\n\nMore Information needed"
] |
3b4431663418299d99ad9b82d5a493a3acecbaa9
|
# Dataset Card for ESG-Prospectus-Clarity-Category
### Dataset Summary
This dataset is a manually annotated, quality training dataset of 1155 ESG language instances (4 classes), obtained via a data extraction pipeline from summary prospectuses of sustainable (ESG) funds.
The ESG sentences were extracted from ‘Principal Investment Strategy’ sections of the documents. The four classes are as follows.
1. Specific ESG Language
2. Ambiguous ESG Language
3. Generic ESG language
4. Risk ESG language
All the instances relate to ESG investment language present in the prospectuses of funds. Furthermore, all instances were annotated for language clarity classes.
### Supported Tasks and Leaderboards
Text Classification (Language style classification)
Few Shot Classification
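As an illustration of the zero-shot setup, a minimal sketch over the four clarity classes (the NLI checkpoint is an assumption, not part of this dataset):

```python
from transformers import pipeline

# Hypothetical zero-shot classifier over the four clarity classes.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

sentence = ("The Fund will seek to invest in companies with sustainable business "
            "models which have a strong consideration for ESG risks and opportunities.")
result = classifier(sentence, candidate_labels=["specific", "ambiguous", "generic", "risk"])
print(result["labels"][0])  # should lean towards "ambiguous" for this example
```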
### Languages
English
## Dataset Structure
### Data Instances
Total instances: 1155
classwise instances:
'Specific ESG': 320
'Ambiguous ESG': 283
'Generic ESG': 264
'Risk ESG': 288
### Data Fields
```
{ "Text": "The Sub-fund's weighted carbon footprint score is equal or better than that of the Custom Bloomberg Climate Transition Benchmark.",
"Label": "specific"
"Text": "The Sub-fund invests a minimum of 5% in green, social, sustainable, and/or sustainability-linked bonds.",
"Label": "specific"
"Text": "The Fund will seek to invest in companies with sustainable business models which have a strong consideration for ESG risks and opportunities.",
"Label": "ambiguous"
}
```
### Data Splits
There's no train/validation/test split.
However, the dataset is available at two levels of categorization:
`esg-prospectus-clarity-category.csv`: Number of classes: 4 ('specific', 'ambiguous', 'generic', 'risk')
`esg-prospectus-clarity-granular-category.csv`: Number of classes: 7 ('specific', 'ambiguous', 'generic', 'general-risk', 'performance-risk', 'data-risk', 'disclaimer-risk')
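A minimal loading sketch (assuming the default `train` split that the Hub generates for CSV data files):

```python
from datasets import load_dataset

# Pass either config name to pick the 4-class or 7-class categorization.
ds = load_dataset("Abhijeet3922/ESG-Prospectus-Clarity-Category",
                  "esg-prospectus-clarity-category")
print(ds["train"][0])  # e.g. {"Text": "...", "Label": "specific"}
```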
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The process begins with downloading the public ‘Summary Prospectuses’ from literature sections of the official websites of various Asset Management Companies (AMCs).
We collected approximately 250 prospectuses of sustainable products.
#### Who are the source language producers?
The source data was written and published by various fund issuers (Asset Management Companies).
### Annotations
#### Annotation process
The dataset was divided into three subsets, and each annotator was allocated 2 subsets of sentences and given a few weeks to label them.
Consequently, each of the 1155 instances was annotated by 2 annotators. We release the standard dataset of sentences with 100% agreement.
#### Who are the annotators?
The open-sourced dataset was annotated by 3 people with adequate knowledge of ESG investing who were fluent in English and had previous exposure to analyzing financial documents.
## Considerations for Using the Data
The dataset can be used to investigate the transparency of the sustainability intentions in the language of ESG disclosures of sustainable funds.
### Discussion of Biases
The data instances might cover language from certain fund issuers (not all). They were extracted from randomly chosen prospectuses from the collected corpus.
The dataset might be revised with broader coverage of prospectus language in the future.
### Licensing Information
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.
If you are interested in commercial use of the data, please contact the following author for an appropriate license:
- [Abhijeet Kumar](mailto:[email protected])
### Citation Information
[More Information Needed]
### Contributions
Thanks to [Nazia Nafis](https://www.linkedin.com/in/nazianafis/) and [Mayank Singh](https://www.linkedin.com/in/mayank-singh-43761b155/) for contributing to the dataset creation process.
Any contribution or further research by the community is welcome.
|
Abhijeet3922/ESG-Prospectus-Clarity-Category
|
[
"task_categories:text-classification",
"task_categories:zero-shot-classification",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-nc-sa-4.0",
"finance",
"region:us"
] |
2023-09-05T05:24:55+00:00
|
{"language": ["en"], "license": "cc-by-nc-sa-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["text-classification", "zero-shot-classification"], "tags": ["finance"], "configs": [{"config_name": "esg-prospectus-clarity-category", "data_files": "esg-prospectus-clarity-category.csv"}, {"config_name": "esg-prospectus-clarity-granular-category", "data_files": "esg-prospectus-clarity-granular-category.csv"}]}
|
2023-09-06T03:55:07+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-text-classification #task_categories-zero-shot-classification #size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #finance #region-us
|
# Dataset Card for ESG-Prospectus-Clarity-Category
### Dataset Summary
This dataset is a manually annotated, quality training dataset of 1155 ESG language instances (4 classes), obtained via a data extraction pipeline from summary prospectuses of sustainable (ESG) funds.
The ESG sentences were extracted from ‘Principal Investment Strategy’ sections of the documents. The four classes are as follows.
1. Specific ESG Language
2. Ambiguous ESG Language
3. Generic ESG language
4. Risk ESG language
All the instances relate to ESG investment language present in the prospectuses of funds. Furthermore, all instances were annotated for language clarity classes.
### Supported Tasks and Leaderboards
Text Classification (Language style classification)
Few Shot Classification
### Languages
English
## Dataset Structure
### Data Instances
Total instances: 1155
classwise instances:
'Specific ESG': 320
'Ambiguous ESG': 283
'Generic ESG': 264
'Risk ESG': 288
### Data Fields
### Data Splits
There's no train/validation/test split.
However, the dataset is available at two levels of categorization:
'URL': Number of classes: 4 ('specific', 'ambiguous', 'generic', 'risk')
'URL': Number of classes: 7 ('specific', 'ambiguous', 'generic', 'general-risk', 'performance-risk', 'data-risk', 'disclaimer-risk')
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The process begins with downloading the public ‘Summary Prospectuses’ from literature sections of the official websites of various Asset Management Companies (AMCs).
We collected approximately 250 prospectuses of sustainable products.
#### Who are the source language producers?
The source data was written and published by various fund issuers (Asset Management Companies).
### Annotations
#### Annotation process
The dataset was divided into three subsets, and each annotator was allocated 2 subsets of sentences and given a few weeks to label them.
Consequently, each of the 1155 instances was annotated by 2 annotators. We release the standard dataset of sentences with 100% agreement.
#### Who are the annotators?
The open-sourced dataset was annotated by 3 people with adequate knowledge of ESG investing who were fluent in English and had previous exposure to analyzing financial documents.
## Considerations for Using the Data
The dataset can be used to investigate the transparency of the sustainability intentions in the language of ESG disclosures of sustainable funds.
### Discussion of Biases
The data instances might cover language from certain fund issuers (not all). They were extracted from randomly chosen prospectuses from the collected corpus.
The dataset might be revised with broader coverage of prospectus language in the future.
### Licensing Information
This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit URL
If you are interested in commercial use of the data, please contact the following author for an appropriate license:
- Abhijeet Kumar
### Contributions
Thanks to Nazia Nafis and Mayank Singh for contributing to the dataset creation process.
Any contribution or further research by the community is welcome.
|
[
"# Dataset Card for ESG-Prospectus-Clarity-Category",
"### Dataset Summary\n\nThis dataset is manually annotated quality training dataset of 1155 ESG language instances (4 classes), obtained via a data extraction pipeline from summary prospectuses of sustainable (ESG) funds.\nThe ESG sentences extracted from ‘Principal Investment Strategy’ sections of the documents. Following are the four classes.\n1. Specific ESG Language\n2. Ambiguous ESG Language\n3. Generic ESG language\n4. Risk ESG language\n\nAll the instances are related to ESG investment language present in prospectus of funds. Further all instances were annotated for language clarity classes.",
"### Supported Tasks and Leaderboards\n\nText Classification (Language style classification)\nFew Shot Classification",
"### Languages\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\nTotal instances: 1155\n\nclasswise instances:\n\n'Specific ESG': 320\n'Ambiguous ESG': 283\n'Generic ESG': 264\n'Risk ESG': 288",
"### Data Fields",
"### Data Splits\n\nThere's no train/validation/test split.\n\nHowever the dataset is available two level of categorizations:\n\n'URL': Number of classes: 4 ('specific', 'ambiguous', 'generic', 'risk')\n'URL': Number of classes: 7 ('specific', 'ambiguous', 'generic', 'general-risk', 'performance-risk', 'data-risk', 'disclaimer-risk')",
"## Dataset Creation",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThe process begins with downloading the public ‘Summary Prospectuses’ from literature sections of the official websites of various Asset Management Companies (AMCs). \nWe collected approximately 250 sustainable products prospectuses.",
"#### Who are the source language producers?\n\nThe source data was written and published by various fund issuers (Asset Management Companies).",
"### Annotations",
"#### Annotation process\n\nThe dataset was divided into three subsets and each annotator was allocated 2 subset of sentences and was given few weeks to label the sentences. \nConsequently, each of the 1155 instances was annotated by 2 annotators. We release standard dataset of sentences after 100% agreement.",
"#### Who are the annotators?\n\nThe open-sourced dataset was annotated by 3 people with adequate knowledge of ESG investing and were fluent in English with previous exposure of analyzing financial documents.",
"## Considerations for Using the Data\n\nThe dataset can be used to investigate the transparency in sustainability intention of language mentioned in ESG disclosures of sustainable funds.",
"### Discussion of Biases\n\nThe data instances might cover languages from certain fund issuers (not all). It was extracted from randomly chosen prospectuses from the collected corpus. \nThe dataset might be revised with broader coverage of prospectus language in future.",
"### Licensing Information\n\nThis work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 Unported License. To view a copy of this license, visit URL\n\nIf you are interested in commercial use of the data, please contact the following author for an appropriate license:\n- Abhijeet Kumar",
"### Contributions\n\nThanks to Nazia Nafis and Mayank Singh for contributing to the dataset creation process.\n\nAny contribution or further research by the community are welcome."
] |
[
"TAGS\n#task_categories-text-classification #task_categories-zero-shot-classification #size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #finance #region-us \n",
"# Dataset Card for ESG-Prospectus-Clarity-Category",
"### Dataset Summary\n\nThis dataset is manually annotated quality training dataset of 1155 ESG language instances (4 classes), obtained via a data extraction pipeline from summary prospectuses of sustainable (ESG) funds.\nThe ESG sentences extracted from ‘Principal Investment Strategy’ sections of the documents. Following are the four classes.\n1. Specific ESG Language\n2. Ambiguous ESG Language\n3. Generic ESG language\n4. Risk ESG language\n\nAll the instances are related to ESG investment language present in prospectus of funds. Further all instances were annotated for language clarity classes.",
"### Supported Tasks and Leaderboards\n\nText Classification (Language style classification)\nFew Shot Classification",
"### Languages\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\nTotal instances: 1155\n\nclasswise instances:\n\n'Specific ESG': 320\n'Ambiguous ESG': 283\n'Generic ESG': 264\n'Risk ESG': 288",
"### Data Fields",
"### Data Splits\n\nThere's no train/validation/test split.\n\nHowever the dataset is available two level of categorizations:\n\n'URL': Number of classes: 4 ('specific', 'ambiguous', 'generic', 'risk')\n'URL': Number of classes: 7 ('specific', 'ambiguous', 'generic', 'general-risk', 'performance-risk', 'data-risk', 'disclaimer-risk')",
"## Dataset Creation",
"### Source Data",
"#### Initial Data Collection and Normalization\n\nThe process begins with downloading the public ‘Summary Prospectuses’ from literature sections of the official websites of various Asset Management Companies (AMCs). \nWe collected approximately 250 sustainable products prospectuses.",
"#### Who are the source language producers?\n\nThe source data was written and published by various fund issuers (Asset Management Companies).",
"### Annotations",
"#### Annotation process\n\nThe dataset was divided into three subsets and each annotator was allocated 2 subset of sentences and was given few weeks to label the sentences. \nConsequently, each of the 1155 instances was annotated by 2 annotators. We release standard dataset of sentences after 100% agreement.",
"#### Who are the annotators?\n\nThe open-sourced dataset was annotated by 3 people with adequate knowledge of ESG investing and were fluent in English with previous exposure of analyzing financial documents.",
"## Considerations for Using the Data\n\nThe dataset can be used to investigate the transparency in sustainability intention of language mentioned in ESG disclosures of sustainable funds.",
"### Discussion of Biases\n\nThe data instances might cover languages from certain fund issuers (not all). It was extracted from randomly chosen prospectuses from the collected corpus. \nThe dataset might be revised with broader coverage of prospectus language in future.",
"### Licensing Information\n\nThis work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 Unported License. To view a copy of this license, visit URL\n\nIf you are interested in commercial use of the data, please contact the following author for an appropriate license:\n- Abhijeet Kumar",
"### Contributions\n\nThanks to Nazia Nafis and Mayank Singh for contributing to the dataset creation process.\n\nAny contribution or further research by the community are welcome."
] |
[
62,
19,
139,
27,
5,
6,
54,
5,
113,
5,
4,
59,
30,
5,
77,
48,
39,
60,
63,
36
] |
[
"passage: TAGS\n#task_categories-text-classification #task_categories-zero-shot-classification #size_categories-1K<n<10K #language-English #license-cc-by-nc-sa-4.0 #finance #region-us \n# Dataset Card for ESG-Prospectus-Clarity-Category### Dataset Summary\n\nThis dataset is manually annotated quality training dataset of 1155 ESG language instances (4 classes), obtained via a data extraction pipeline from summary prospectuses of sustainable (ESG) funds.\nThe ESG sentences extracted from ‘Principal Investment Strategy’ sections of the documents. Following are the four classes.\n1. Specific ESG Language\n2. Ambiguous ESG Language\n3. Generic ESG language\n4. Risk ESG language\n\nAll the instances are related to ESG investment language present in prospectus of funds. Further all instances were annotated for language clarity classes.### Supported Tasks and Leaderboards\n\nText Classification (Language style classification)\nFew Shot Classification### Languages\n\nEnglish## Dataset Structure### Data Instances\n\nTotal instances: 1155\n\nclasswise instances:\n\n'Specific ESG': 320\n'Ambiguous ESG': 283\n'Generic ESG': 264\n'Risk ESG': 288### Data Fields### Data Splits\n\nThere's no train/validation/test split.\n\nHowever the dataset is available two level of categorizations:\n\n'URL': Number of classes: 4 ('specific', 'ambiguous', 'generic', 'risk')\n'URL': Number of classes: 7 ('specific', 'ambiguous', 'generic', 'general-risk', 'performance-risk', 'data-risk', 'disclaimer-risk')## Dataset Creation### Source Data#### Initial Data Collection and Normalization\n\nThe process begins with downloading the public ‘Summary Prospectuses’ from literature sections of the official websites of various Asset Management Companies (AMCs). \nWe collected approximately 250 sustainable products prospectuses."
] |
55f0a024cdfd4f6adec1a12300f9d2457c508a24
|
# Dataset Card for "slu-augmented-data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
foxxy-hm/slu-augmented-data
|
[
"region:us"
] |
2023-09-05T05:30:50+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "speech", "sequence": "float64"}, {"name": "sampling_rate", "dtype": "int64"}, {"name": "target_text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3720263051.262795, "num_examples": 7190}, {"name": "test", "num_bytes": 930324473.7372051, "num_examples": 1798}], "download_size": 2043481654, "dataset_size": 4650587525.0}}
|
2023-09-05T15:59:46+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "slu-augmented-data"
More Information needed
|
[
"# Dataset Card for \"slu-augmented-data\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"slu-augmented-data\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"slu-augmented-data\"\n\nMore Information needed"
] |
1a0f2aa8afaf711402ebd7ef07066e8d722feea6
|
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
Feanix/ljkn
|
[
"region:us"
] |
2023-09-05T05:30:55+00:00
|
{}
|
2023-09-05T15:23:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Description
- Homepage:
- Repository:
- Paper:
- Leaderboard:
- Point of Contact:
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
8,
24,
32,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e9d7d3ad2f1535aa67effe8de0bf38079673112a
|
# Dataset Card for "nordjylland-news-image-captioning"
## Dataset Description
- **Point of Contact:** [Oliver Kinch](mailto:[email protected])
- **Size of dataset:** 11 GB
### Dataset Summary
This dataset is a collection of image-caption pairs from the Danish newspaper [TV2 Nord](https://www.tv2nord.dk/).
### Supported Tasks and Leaderboards
Image captioning is the intended task for this dataset. No leaderboard is active at this point.
### Languages
The dataset is available in Danish (`da`).
## Dataset Structure
An example from the dataset looks as follows.
```
{
"file_name": "1.jpg",
"caption": "Bruno Sørensen og Poul Erik Pedersen er ofte at finde i Fyensgade Centret."
}
```
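A minimal loading sketch (field names follow the repo metadata; note the dataset is roughly 11 GB, so `streaming=True` may be preferable):

```python
from datasets import load_dataset

ds = load_dataset("alexandrainst/nordjylland-news-image-captioning", split="train")
sample = ds[0]
print(sample["caption"])  # the Danish caption string
image = sample["image"]   # the decoded PIL image
```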
### Data Fields
- `file_name`: a `string` giving the file name of the image.
- `caption`: a `string` feature.
### Dataset Statistics
#### Number of samples
11707
#### Image sizes
All images in the dataset are in RGB format, but they exhibit varying resolutions:
- Width ranges from 73 to 11,830 pixels.
- Height ranges from 38 to 8,268 pixels.
The side length of a square image with the same number of pixels as an image with height \\(h \\) and width \\(w \\) is approximately given as
\\( x = \text{int}(\sqrt{h \cdot w}) \\).
Plotting the distribution of \\( x \\) gives an insight into the sizes of the images in the dataset.

#### Caption Length Distribution

## Potential Dataset Issues
- There are 14 images with the caption "Arkivfoto".
- There are 37 images with captions consisting solely of a source reference, such as "Kilde: \<name of source\>".
You might want to consider excluding these samples from the model training process.
## Dataset Creation
### Curation Rationale
There are not many large-scale image-captioning datasets in Danish.
### Source Data
The dataset has been collected through the TV2 Nord API, which can be accessed [here](https://developer.bazo.dk/#876ab6f9-e057-43e3-897a-1563de34397e).
## Additional Information
### Dataset Curators
[Oliver Kinch](https://huggingface.co/oliverkinch) from [The Alexandra
Institute](https://alexandra.dk/)
### Licensing Information
The dataset is licensed under the [CC0
license](https://creativecommons.org/share-your-work/public-domain/cc0/).
|
alexandrainst/nordjylland-news-image-captioning
|
[
"task_categories:image-to-text",
"task_categories:zero-shot-image-classification",
"task_categories:feature-extraction",
"task_ids:image-captioning",
"size_categories:10K<n<100K",
"language:da",
"license:apache-2.0",
"region:us"
] |
2023-09-05T05:32:33+00:00
|
{"language": ["da"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["image-to-text", "zero-shot-image-classification", "feature-extraction"], "task_ids": ["image-captioning"], "pretty_name": "Nordjylland News - Image caption dataset", "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 10341164216.808, "num_examples": 11707}], "download_size": 11002607252, "dataset_size": 10341164216.808}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-11-28T15:36:16+00:00
|
[] |
[
"da"
] |
TAGS
#task_categories-image-to-text #task_categories-zero-shot-image-classification #task_categories-feature-extraction #task_ids-image-captioning #size_categories-10K<n<100K #language-Danish #license-apache-2.0 #region-us
|
# Dataset Card for "nordjylland-news-image-captioning"
## Dataset Description
- Point of Contact: Oliver Kinch
- Size of dataset: 11 GB
### Dataset Summary
This dataset is a collection of image-caption pairs from the Danish newspaper TV2 Nord.
### Supported Tasks and Leaderboards
Image captioning is the intended task for this dataset. No leaderboard is active at this point.
### Languages
The dataset is available in Danish ('da').
## Dataset Structure
An example from the dataset looks as follows.
### Data Fields
- 'file_name': a 'string' giving the file name of the image.
- 'caption': a 'string' feature.
### Dataset Statistics
#### Number of samples
11707
#### Image sizes
All images in the dataset are in RGB format, but they exhibit varying resolutions:
- Width ranges from 73 to 11,830 pixels.
- Height ranges from 38 to 8,268 pixels.
The side length of a square image with the same number of pixels as an image with height \\(h \\) and width \\(w \\) is approximately given as
\\( x = \text{int}(\sqrt{h \cdot w}) \\).
Plotting the distribution of \\( x \\) gives an insight into the sizes of the images in the dataset.
!image_size_distribution
#### Caption Length Distribution
!caption_length_distribution.png
## Potential Dataset Issues
- There are 14 images with the caption "Arkivfoto".
- There are 37 images with captions consisting solely of a source reference, such as "Kilde: \<name of source\>".
You might want to consider excluding these samples from the model training process.
## Dataset Creation
### Curation Rationale
There are not many large-scale image-captioning datasets in Danish.
### Source Data
The dataset has been collected through the TV2 Nord API, which can be accessed here.
## Additional Information
### Dataset Curators
Oliver Kinch from The Alexandra
Institute
### Licensing Information
The dataset is licensed under the CC0
license.
|
[
"# Dataset Card for \"nordjylland-news-image-captioning\"",
"## Dataset Description\n\n- Point of Contact: Oliver Kinch\n- Size of dataset: 11 GB",
"### Dataset Summary\n\nThis dataset is a collection of image-caption pairs from the Danish newspaper TV2 Nord.",
"### Supported Tasks and Leaderboards\n\nImage captioning is the intended task for this dataset. No leaderboard is active at this point.",
"### Languages\n\nThe dataset is available in Danish ('da').",
"## Dataset Structure\n\nAn example from the dataset looks as follows.",
"### Data Fields\n\n- 'file_name': a 'string' giving the file name of the image.\n- 'caption': a 'string' feature.",
"### Dataset Statistics",
"#### Number of samples\n\n11707",
"#### Image sizes\n\nAll images in the dataset are in RGB format, but they exhibit varying resolutions:\n\n- Width ranges from 73 to 11,830 pixels.\n\n- Height ranges from 38 to 8,268 pixels.\n\nThe side length of a square image with the same number of pixels as an image with height \\\\(h \\\\) and width \\\\(w \\\\) is approximately given as\n\n\\\\( x = \\text{int}({{\\sqrt{h \\cdot w}})} \\\\).\n\nPlotting the distribution of \\\\( x \\\\) gives an insight into the sizes of the images in the dataset.\n\n\n!image_size_distribution",
"#### Caption Length Distribution\n\n!caption_length_distribution.png",
"## Potential Dataset Issues\n- There are 14 images with the caption \"Arkivfoto\".\n\n- There are 37 images with captions consisting solely of a source reference, such as \"Kilde: \\<name of source\\>\".\n\nYou might want to consider excluding these samples from the model training process.",
"## Dataset Creation",
"### Curation Rationale\n\nThere are not many large-scale image-captioning datasets in Danish.",
"### Source Data\n\nThe dataset has been collected through the TV2 Nord API, which can be accessed here.",
"## Additional Information",
"### Dataset Curators\n\nOliver Kinch from the The Alexandra\nInstitute",
"### Licensing Information\n\nThe dataset is licensed under the CC0\nlicense."
] |
[
"TAGS\n#task_categories-image-to-text #task_categories-zero-shot-image-classification #task_categories-feature-extraction #task_ids-image-captioning #size_categories-10K<n<100K #language-Danish #license-apache-2.0 #region-us \n",
"# Dataset Card for \"nordjylland-news-image-captioning\"",
"## Dataset Description\n\n- Point of Contact: Oliver Kinch\n- Size of dataset: 11 GB",
"### Dataset Summary\n\nThis dataset is a collection of image-caption pairs from the Danish newspaper TV2 Nord.",
"### Supported Tasks and Leaderboards\n\nImage captioning is the intended task for this dataset. No leaderboard is active at this point.",
"### Languages\n\nThe dataset is available in Danish ('da').",
"## Dataset Structure\n\nAn example from the dataset looks as follows.",
"### Data Fields\n\n- 'file_name': a 'string' giving the file name of the image.\n- 'caption': a 'string' feature.",
"### Dataset Statistics",
"#### Number of samples\n\n11707",
"#### Image sizes\n\nAll images in the dataset are in RGB format, but they exhibit varying resolutions:\n\n- Width ranges from 73 to 11,830 pixels.\n\n- Height ranges from 38 to 8,268 pixels.\n\nThe side length of a square image with the same number of pixels as an image with height \\\\(h \\\\) and width \\\\(w \\\\) is approximately given as\n\n\\\\( x = \\text{int}({{\\sqrt{h \\cdot w}})} \\\\).\n\nPlotting the distribution of \\\\( x \\\\) gives an insight into the sizes of the images in the dataset.\n\n\n!image_size_distribution",
"#### Caption Length Distribution\n\n!caption_length_distribution.png",
"## Potential Dataset Issues\n- There are 14 images with the caption \"Arkivfoto\".\n\n- There are 37 images with captions consisting solely of a source reference, such as \"Kilde: \\<name of source\\>\".\n\nYou might want to consider excluding these samples from the model training process.",
"## Dataset Creation",
"### Curation Rationale\n\nThere are not many large-scale image-captioning datasets in Danish.",
"### Source Data\n\nThe dataset has been collected through the TV2 Nord API, which can be accessed here.",
"## Additional Information",
"### Dataset Curators\n\nOliver Kinch from the The Alexandra\nInstitute",
"### Licensing Information\n\nThe dataset is licensed under the CC0\nlicense."
] |
[
81,
18,
20,
27,
31,
16,
17,
36,
6,
8,
151,
19,
68,
5,
26,
25,
5,
14,
18
] |
[
"passage: TAGS\n#task_categories-image-to-text #task_categories-zero-shot-image-classification #task_categories-feature-extraction #task_ids-image-captioning #size_categories-10K<n<100K #language-Danish #license-apache-2.0 #region-us \n# Dataset Card for \"nordjylland-news-image-captioning\"## Dataset Description\n\n- Point of Contact: Oliver Kinch\n- Size of dataset: 11 GB### Dataset Summary\n\nThis dataset is a collection of image-caption pairs from the Danish newspaper TV2 Nord.### Supported Tasks and Leaderboards\n\nImage captioning is the intended task for this dataset. No leaderboard is active at this point.### Languages\n\nThe dataset is available in Danish ('da').## Dataset Structure\n\nAn example from the dataset looks as follows.### Data Fields\n\n- 'file_name': a 'string' giving the file name of the image.\n- 'caption': a 'string' feature.### Dataset Statistics#### Number of samples\n\n11707#### Image sizes\n\nAll images in the dataset are in RGB format, but they exhibit varying resolutions:\n\n- Width ranges from 73 to 11,830 pixels.\n\n- Height ranges from 38 to 8,268 pixels.\n\nThe side length of a square image with the same number of pixels as an image with height \\\\(h \\\\) and width \\\\(w \\\\) is approximately given as\n\n\\\\( x = \\text{int}({{\\sqrt{h \\cdot w}})} \\\\).\n\nPlotting the distribution of \\\\( x \\\\) gives an insight into the sizes of the images in the dataset.\n\n\n!image_size_distribution#### Caption Length Distribution\n\n!caption_length_distribution.png## Potential Dataset Issues\n- There are 14 images with the caption \"Arkivfoto\".\n\n- There are 37 images with captions consisting solely of a source reference, such as \"Kilde: \\<name of source\\>\".\n\nYou might want to consider excluding these samples from the model training process.## Dataset Creation"
] |
ecd014fe950e3c4eb6dd0bcae018537722cf77c9
|
# Dataset Card for "miumiu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
EliKet/miumiu
|
[
"region:us"
] |
2023-09-05T05:38:52+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "image_name", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 21220034.0, "num_examples": 18}], "download_size": 21212241, "dataset_size": 21220034.0}}
|
2023-09-08T06:30:58+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "miumiu"
More Information needed
|
[
"# Dataset Card for \"miumiu\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"miumiu\"\n\nMore Information needed"
] |
[
6,
12
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"miumiu\"\n\nMore Information needed"
] |
2e2972d2cf8a27e1aa77cea28235813441477d73
|
# Dataset Card for "framed_wall_art_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Falah/framed_wall_art_prompts_SDXL
|
[
"region:us"
] |
2023-09-05T05:40:54+00:00
|
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 390982557, "num_examples": 1000000}], "download_size": 39212995, "dataset_size": 390982557}}
|
2023-09-05T05:41:04+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "framed_wall_art_prompts_SDXL"
More Information needed
|
[
"# Dataset Card for \"framed_wall_art_prompts_SDXL\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"framed_wall_art_prompts_SDXL\"\n\nMore Information needed"
] |
[
6,
23
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"framed_wall_art_prompts_SDXL\"\n\nMore Information needed"
] |
a732be10d8a08755df0a24b9c555e24db412c29e
|
# Dataset Card for "human_generator_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Falah/human_generator_prompts
|
[
"region:us"
] |
2023-09-05T06:02:56+00:00
|
{"dataset_info": {"features": [{"name": "prompts", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 483970412, "num_examples": 1000000}], "download_size": 61161249, "dataset_size": 483970412}}
|
2023-09-05T06:03:06+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "human_generator_prompts"
More Information needed
|
[
"# Dataset Card for \"human_generator_prompts\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"human_generator_prompts\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"human_generator_prompts\"\n\nMore Information needed"
] |
3170e279b94da7cec3bf49dcc63532c9c3367b18
|
A medical forms dataset containing scanned documents is a valuable resource for healthcare professionals, researchers, and institutions seeking to streamline and improve their administrative and patient care processes. This dataset comprises digitized versions of various medical forms, such as patient intake forms, consent forms, health assessment questionnaires, and more, which have been scanned for electronic storage and easy access.
These scanned medical forms preserve the layout and structure of the original paper documents, including checkboxes, text fields, and signature spaces. Researchers and healthcare organizations can leverage this dataset to develop automated data extraction solutions, electronic health record (EHR) systems, and machine learning models for tasks like form recognition, data validation, and patient data management.
Additionally, this dataset serves as a valuable training and evaluation resource for image processing and optical character recognition (OCR) algorithms, enhancing the accuracy and efficiency of document digitization efforts within the healthcare sector. With the potential to improve data accuracy, reduce administrative burdens, and enhance patient care, the medical forms dataset with scanned documents is a cornerstone for advancing healthcare data management and accessibility.
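As a minimal sketch of the OCR use case described above (assuming `pytesseract` as the OCR engine; it requires the Tesseract binary to be installed locally):

```python
from datasets import load_dataset
import pytesseract

ds = load_dataset("saurabh1896/OMR-scanned-documents", split="train")
text = pytesseract.image_to_string(ds[0]["image"])  # run OCR over one scanned form
print(text[:200])
```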
|
saurabh1896/OMR-scanned-documents
|
[
"region:us"
] |
2023-09-05T06:09:00+00:00
|
{"dataset_info": {"features": [{"name": "image", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 8217916.0, "num_examples": 36}], "download_size": 8174461, "dataset_size": 8217916.0}}
|
2023-09-05T06:10:13+00:00
|
[] |
[] |
TAGS
#region-us
|
A medical forms dataset containing scanned documents is a valuable resource for healthcare professionals, researchers, and institutions seeking to streamline and improve their administrative and patient care processes. This dataset comprises digitized versions of various medical forms, such as patient intake forms, consent forms, health assessment questionnaires, and more, which have been scanned for electronic storage and easy access.
These scanned medical forms preserve the layout and structure of the original paper documents, including checkboxes, text fields, and signature spaces. Researchers and healthcare organizations can leverage this dataset to develop automated data extraction solutions, electronic health record (EHR) systems, and machine learning models for tasks like form recognition, data validation, and patient data management.
Additionally, this dataset serves as a valuable training and evaluation resource for image processing and optical character recognition (OCR) algorithms, enhancing the accuracy and efficiency of document digitization efforts within the healthcare sector. With the potential to improve data accuracy, reduce administrative burdens, and enhance patient care, the medical forms dataset with scanned documents is a cornerstone for advancing healthcare data management and accessibility.
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
a3c8aa60213b19d6b81febfcfd6563759d6e958a
|
# Dataset Card for "SamleData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Nil007/SampleDataDocVQA
|
[
"region:us"
] |
2023-09-05T06:15:24+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "image", "dtype": "image"}, {"name": "query", "struct": [{"name": "de", "dtype": "string"}, {"name": "en", "dtype": "string"}, {"name": "es", "dtype": "string"}, {"name": "fr", "dtype": "string"}, {"name": "it", "dtype": "string"}]}, {"name": "answers", "sequence": "string"}, {"name": "words", "sequence": "string"}, {"name": "bounding_boxes", "sequence": {"sequence": "float32", "length": 4}}, {"name": "answer", "struct": [{"name": "match_score", "dtype": "float64"}, {"name": "matched_text", "dtype": "string"}, {"name": "start", "dtype": "int64"}, {"name": "text", "dtype": "string"}]}, {"name": "ground_truth", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 3504131.0, "num_examples": 10}, {"name": "test", "num_bytes": 1444850.0, "num_examples": 5}], "download_size": 2542845, "dataset_size": 4948981.0}}
|
2023-09-05T06:24:30+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "SamleData"
More Information needed
|
[
"# Dataset Card for \"SamleData\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"SamleData\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"SamleData\"\n\nMore Information needed"
] |
d61c7d3ec6da83a0118c4b0ff1afa513304132f1
|
# Dataset Card for "pubmed_nonacademic_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
zxvix/pubmed_nonacademic_100
|
[
"region:us"
] |
2023-09-05T06:15:29+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "MedlineCitation", "struct": [{"name": "Article", "struct": [{"name": "Abstract", "struct": [{"name": "AbstractText", "dtype": "string"}]}, {"name": "ArticleTitle", "dtype": "string"}, {"name": "AuthorList", "struct": [{"name": "Author", "struct": [{"name": "CollectiveName", "sequence": "string"}, {"name": "ForeName", "sequence": "string"}, {"name": "Initials", "sequence": "string"}, {"name": "LastName", "sequence": "string"}]}]}, {"name": "GrantList", "struct": [{"name": "Grant", "struct": [{"name": "Agency", "sequence": "string"}, {"name": "Country", "sequence": "string"}, {"name": "GrantID", "sequence": "string"}]}]}, {"name": "Language", "dtype": "string"}, {"name": "PublicationTypeList", "struct": [{"name": "PublicationType", "sequence": "string"}]}]}, {"name": "ChemicalList", "struct": [{"name": "Chemical", "struct": [{"name": "NameOfSubstance", "sequence": "string"}, {"name": "RegistryNumber", "sequence": "string"}]}]}, {"name": "CitationSubset", "dtype": "string"}, {"name": "DateCompleted", "struct": [{"name": "Day", "dtype": "int64"}, {"name": "Month", "dtype": "int64"}, {"name": "Year", "dtype": "int64"}]}, {"name": "DateRevised", "struct": [{"name": "Day", "dtype": "int64"}, {"name": "Month", "dtype": "int64"}, {"name": "Year", "dtype": "int64"}]}, {"name": "MedlineJournalInfo", "struct": [{"name": "Country", "dtype": "string"}]}, {"name": "MeshHeadingList", "struct": [{"name": "MeshHeading", "struct": [{"name": "DescriptorName", "sequence": "string"}, {"name": "QualifierName", "sequence": "string"}]}]}, {"name": "NumberOfReferences", "dtype": "int64"}, {"name": "PMID", "dtype": "int64"}]}, {"name": "PubmedData", "struct": [{"name": "ArticleIdList", "struct": [{"name": "ArticleId", "sequence": {"sequence": "string"}}]}, {"name": "History", "struct": [{"name": "PubMedPubDate", "struct": [{"name": "Day", "sequence": "int64"}, {"name": "Month", "sequence": "int64"}, {"name": "Year", "sequence": "int64"}]}]}, {"name": "PublicationStatus", "dtype": "string"}, {"name": "ReferenceList", "struct": [{"name": "Citation", "sequence": "null"}, {"name": "CitationId", "sequence": "null"}]}]}, {"name": "text", "dtype": "string"}, {"name": "title", "dtype": "string"}, {"name": "original_text", "dtype": "string"}], "splits": [{"name": "test", "num_bytes": 417158, "num_examples": 100}], "download_size": 282401, "dataset_size": 417158}}
|
2023-09-05T07:14:42+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pubmed_nonacademic_100"
More Information needed
|
[
"# Dataset Card for \"pubmed_nonacademic_100\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pubmed_nonacademic_100\"\n\nMore Information needed"
] |
[
6,
19
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pubmed_nonacademic_100\"\n\nMore Information needed"
] |
4819848ce004529d2f2627ac565620ebfd854c5e
|
# Dataset Card for "funsd-zh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
QLM78910/funsd-zh
|
[
"region:us"
] |
2023-09-05T06:28:41+00:00
|
{"dataset_info": {"features": [{"name": "lang", "dtype": "string"}, {"name": "version", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "documents", "list": [{"name": "id", "dtype": "string"}, {"name": "uid", "dtype": "string"}, {"name": "document", "list": [{"name": "box", "sequence": "int64"}, {"name": "text", "dtype": "string"}, {"name": "label", "dtype": "string"}, {"name": "words", "list": [{"name": "box", "sequence": "int64"}, {"name": "text", "dtype": "string"}]}, {"name": "linking", "sequence": {"sequence": "int64"}}, {"name": "id", "dtype": "int64"}]}, {"name": "img", "struct": [{"name": "fname", "dtype": "string"}, {"name": "width", "dtype": "int64"}, {"name": "height", "dtype": "int64"}]}]}], "splits": [{"name": "train", "num_bytes": 4057416, "num_examples": 1}, {"name": "val", "num_bytes": 1483956, "num_examples": 1}], "download_size": 1269925, "dataset_size": 5541372}}
|
2023-09-05T06:28:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "funsd-zh"
More Information needed
|
[
"# Dataset Card for \"funsd-zh\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"funsd-zh\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"funsd-zh\"\n\nMore Information needed"
] |
b4416905ac86f3d440d6c65c5decf7d70b01642f
|
# Dataset Card for "oa-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
TinyPixel/oa-2
|
[
"region:us"
] |
2023-09-05T06:45:23+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 9475124, "num_examples": 8274}], "download_size": 5126342, "dataset_size": 9475124}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T08:59:20+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "oa-2"
More Information needed
|
[
"# Dataset Card for \"oa-2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"oa-2\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"oa-2\"\n\nMore Information needed"
] |
648d14d3db62d52b47f8d889ff472c715b420592
|
# Dataset Card for "pco_audio_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
warleagle/pco_audio_data
|
[
"region:us"
] |
2023-09-05T06:48:53+00:00
|
{"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}], "splits": [{"name": "train", "num_bytes": 46266636.0, "num_examples": 2}], "download_size": 46268833, "dataset_size": 46266636.0}}
|
2023-09-05T06:55:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pco_audio_data"
More Information needed
|
[
"# Dataset Card for \"pco_audio_data\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pco_audio_data\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pco_audio_data\"\n\nMore Information needed"
] |
5fbf0d55bea37db22c56c5e593654f5a6ce67ec8
|
# Dataset Card for "pco_audio_data_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
warleagle/pco_audio_data_v2
|
[
"region:us"
] |
2023-09-05T06:56:05+00:00
|
{"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}], "splits": [{"name": "train", "num_bytes": 195374660.0, "num_examples": 6}], "download_size": 195380376, "dataset_size": 195374660.0}}
|
2023-09-05T06:56:35+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pco_audio_data_v2"
More Information needed
|
[
"# Dataset Card for \"pco_audio_data_v2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pco_audio_data_v2\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pco_audio_data_v2\"\n\nMore Information needed"
] |
2f649ca4ba95d9ade5194d1fb8d5d1928d103ebd
|
# Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
sachith-surge/LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4
|
[
"region:us"
] |
2023-09-05T07:02:53+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "source", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "llama2_status", "dtype": "string"}, {"name": "llama2_rating", "dtype": "string"}, {"name": "llama2_reason", "dtype": "string"}, {"name": "gpt4_status", "dtype": "string"}, {"name": "gpt4_rating", "dtype": "string"}, {"name": "gpt4_reason", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2729018, "num_examples": 1505}], "download_size": 1378351, "dataset_size": 2729018}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T07:02:56+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4"
More Information needed
|
[
"# Dataset Card for \"LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4\"\n\nMore Information needed"
] |
[
6,
47
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4\"\n\nMore Information needed"
] |
4e78756686347a72f35aa6ec063dd446826e99a7
|
# Dataset Card for "training-data-blog-writer_v05-09-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
talentlabs/training-data-blog-writer_v05-09-2023
|
[
"region:us"
] |
2023-09-05T07:06:13+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "title", "dtype": "string"}, {"name": "article", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 53198298, "num_examples": 10100}], "download_size": 32850622, "dataset_size": 53198298}}
|
2023-09-05T07:06:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "training-data-blog-writer_v05-09-2023"
More Information needed
|
[
"# Dataset Card for \"training-data-blog-writer_v05-09-2023\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"training-data-blog-writer_v05-09-2023\"\n\nMore Information needed"
] |
[
6,
23
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"training-data-blog-writer_v05-09-2023\"\n\nMore Information needed"
] |
1ba079982962f5d5523efe1f2795e1285d357cde
|
# Dataset Card for "simpsons_prompt_lines"

I used the [Simpsons](https://www.kaggle.com/datasets/prashant111/the-simpsons-dataset?resource=download&select=simpsons_episodes.csv) Kaggle dataset (simpsons_episodes.csv and simpsons_script_lines.csv).
I got the idea and part of the code from this [blog post](https://replicate.com/blog/fine-tune-llama-to-speak-like-homer-simpson) from Replicate.
This can be used to fine-tune a chat LLM to speak like one of the characters of the show!
### Example
```json
{
"previous": "Marge Simpson: Homer, get up! Up, up, up!\nMarge Simpson: Oh no!\nHomer Simpson: Whuzzit... My juice box!\nMarge Simpson: Sorry, Homie, but you promised to take me to the Apron Expo today.",
"character": "Homer Simpson",
"line": "Just give me ten more hours.",
"text": "<s> [INST] Below is a script from the American animated sitcom The Simpsons. Write a response that completes Homer Simpson's last line in the conversation. \n\nMarge Simpson: Homer, get up! Up, up, up!\nMarge Simpson: Oh no!\nHomer Simpson: Whuzzit... My juice box!\nMarge Simpson: Sorry, Homie, but you promised to take me to the Apron Expo today.\nHomer Simpson: [/INST] Just give me ten more hours. </s>"
}
```
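As a minimal sketch (assuming the Hugging Face `datasets` library), the `text` field can be rebuilt from the other three fields, which is useful if you want to swap in a different prompt template; the exact template below is inferred from the example record above, not from any official spec:

```python
# Minimal sketch: rebuild the fine-tuning prompt from the raw fields.
# Field names (`previous`, `character`, `line`) come from the example above.
from datasets import load_dataset

ds = load_dataset("PlenitudeAI/simpsons_prompt_lines", split="train")

def build_prompt(row):
    instruction = (
        "Below is a script from the American animated sitcom The Simpsons. "
        f"Write a response that completes {row['character']}'s last line "
        "in the conversation. \n\n"
    )
    return {
        "text": f"<s> [INST] {instruction}{row['previous']}\n"
                f"{row['character']}: [/INST] {row['line']} </s>"
    }

ds = ds.map(build_prompt)
print(ds[0]["text"])
```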
### Characters
- Homer Simpson
- Bart Simpson
- Marge Simpson
- Lisa Simpson
- C. Montgomery Burns
- Seymour Skinner
- Moe Szyslak
- Ned Flanders
- Grampa Simpson
- Krusty the Clown
- Chief Wiggum
- Milhouse Van Houten
- Waylon Smithers
- Apu Nahasapeemapetilon
- Kent Brockman
- Nelson Muntz
- Barney Gumble
- Lenny Leonard
- Edna Krabappel-Flanders
- Sideshow Bob
- Dr. Julius Hibbert
- Selma Bouvier
- Ralph Wiggum
- Rev. Timothy Lovejoy
- Crowd
- Carl Carlson
- Patty Bouvier
- Mayor Joe Quimby
- Otto Mann
- Groundskeeper Willie
- Martin Prince
- Announcer
- Comic Book Guy
- Kids
- Lionel Hutz
- HERB
- Sideshow Mel
- Gary Chalmers
- Professor Jonathan Frink
- Jimbo Jones
- Lou
- Todd Flanders
- Miss Hoover
- Agnes Skinner
- Maude Flanders
- Troy McClure
- Fat Tony
- Snake Jailbird
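If you only want lines spoken by a single character from the list above, a hedged usage sketch (again assuming the `datasets` library):

```python
# Minimal sketch: keep only one character's lines for fine-tuning.
from datasets import load_dataset

ds = load_dataset("PlenitudeAI/simpsons_prompt_lines", split="train")
homer = ds.filter(lambda row: row["character"] == "Homer Simpson")
print(len(homer), "Homer Simpson lines")
```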
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
PlenitudeAI/simpsons_prompt_lines
|
[
"region:us"
] |
2023-09-05T07:18:25+00:00
|
{"dataset_info": {"features": [{"name": "previous", "dtype": "string"}, {"name": "character", "dtype": "string"}, {"name": "line", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 191013022, "num_examples": 121841}], "download_size": 0, "dataset_size": 191013022}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T07:59:33+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "simpsons_prompt_lines"
!image
I used the Simpsons Kaggle dataset (simpsons_episodes.csv and simpsons_script_lines.csv).
I got the idea and part of the code from this blog post from Replicate.
This can be used to fine-tune a chat LLM to speak like one of the characters of the show!
### Example
### Characters
- Homer Simpson
- Bart Simpson
- Marge Simpson
- Lisa Simpson
- C. Montgomery Burns
- Seymour Skinner
- Moe Szyslak
- Ned Flanders
- Grampa Simpson
- Krusty the Clown
- Chief Wiggum
- Milhouse Van Houten
- Waylon Smithers
- Apu Nahasapeemapetilon
- Kent Brockman
- Nelson Muntz
- Barney Gumble
- Lenny Leonard
- Edna Krabappel-Flanders
- Sideshow Bob
- Dr. Julius Hibbert
- Selma Bouvier
- Ralph Wiggum
- Rev. Timothy Lovejoy
- Crowd
- Carl Carlson
- Patty Bouvier
- Mayor Joe Quimby
- Otto Mann
- Groundskeeper Willie
- Martin Prince
- Announcer
- Comic Book Guy
- Kids
- Lionel Hutz
- HERB
- Sideshow Mel
- Gary Chalmers
- Professor Jonathan Frink
- Jimbo Jones
- Lou
- Todd Flanders
- Miss Hoover
- Agnes Skinner
- Maude Flanders
- Troy McClure
- Fat Tony
- Snake Jailbird
More Information needed
|
[
"# Dataset Card for \"simpsons_prompt_lines\"\n\n!image\n\nI used the Simpsons Kaggle dataset (simpsons_episodes.csv and simpsons_script_lines.csv)\nI got the idea and part of the code from this blog post from Replicate.\n\nThis can be used to fine-tune a Chat LLM model, to speak like one of the characters of the show !",
"### Example",
"### Characters \n\n- Homer Simpson\n- Bart Simpson\n- Marge Simpson\n- Lisa Simpson\n- C. Montgomery Burns\n- Seymour Skinner\n- Moe Szyslak\n- Ned Flanders\n- Grampa Simpson\n- Krusty the Clown\n- Chief Wiggum\n- Milhouse Van Houten\n- Waylon Smithers\n- Apu Nahasapeemapetilon\n- Kent Brockman\n- Nelson Muntz\n- Barney Gumble\n- Lenny Leonard\n- Edna Krabappel-Flanders\n- Sideshow Bob\n- Dr. Julius Hibbert\n- Selma Bouvier\n- Ralph Wiggum\n- Rev. Timothy Lovejoy\n- Crowd\n- Carl Carlson\n- Patty Bouvier\n- Mayor Joe Quimby\n- Otto Mann\n- Groundskeeper Willie\n- Martin Prince\n- Announcer\n- Comic Book Guy\n- Kids\n- Lionel Hutz\n- HERB\n- Sideshow Mel\n- Gary Chalmers\n- Professor Jonathan Frink\n- Jimbo Jones\n- Lou\n- Todd Flanders\n- Miss Hoover\n- Agnes Skinner\n- Maude Flanders\n- Troy McClure\n- Fat Tony\n- Snake Jailbird\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"simpsons_prompt_lines\"\n\n!image\n\nI used the Simpsons Kaggle dataset (simpsons_episodes.csv and simpsons_script_lines.csv)\nI got the idea and part of the code from this blog post from Replicate.\n\nThis can be used to fine-tune a Chat LLM model, to speak like one of the characters of the show !",
"### Example",
"### Characters \n\n- Homer Simpson\n- Bart Simpson\n- Marge Simpson\n- Lisa Simpson\n- C. Montgomery Burns\n- Seymour Skinner\n- Moe Szyslak\n- Ned Flanders\n- Grampa Simpson\n- Krusty the Clown\n- Chief Wiggum\n- Milhouse Van Houten\n- Waylon Smithers\n- Apu Nahasapeemapetilon\n- Kent Brockman\n- Nelson Muntz\n- Barney Gumble\n- Lenny Leonard\n- Edna Krabappel-Flanders\n- Sideshow Bob\n- Dr. Julius Hibbert\n- Selma Bouvier\n- Ralph Wiggum\n- Rev. Timothy Lovejoy\n- Crowd\n- Carl Carlson\n- Patty Bouvier\n- Mayor Joe Quimby\n- Otto Mann\n- Groundskeeper Willie\n- Martin Prince\n- Announcer\n- Comic Book Guy\n- Kids\n- Lionel Hutz\n- HERB\n- Sideshow Mel\n- Gary Chalmers\n- Professor Jonathan Frink\n- Jimbo Jones\n- Lou\n- Todd Flanders\n- Miss Hoover\n- Agnes Skinner\n- Maude Flanders\n- Troy McClure\n- Fat Tony\n- Snake Jailbird\n\nMore Information needed"
] |
[
6,
93,
4,
236
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"simpsons_prompt_lines\"\n\n!image\n\nI used the Simpsons Kaggle dataset (simpsons_episodes.csv and simpsons_script_lines.csv)\nI got the idea and part of the code from this blog post from Replicate.\n\nThis can be used to fine-tune a Chat LLM model, to speak like one of the characters of the show !### Example### Characters \n\n- Homer Simpson\n- Bart Simpson\n- Marge Simpson\n- Lisa Simpson\n- C. Montgomery Burns\n- Seymour Skinner\n- Moe Szyslak\n- Ned Flanders\n- Grampa Simpson\n- Krusty the Clown\n- Chief Wiggum\n- Milhouse Van Houten\n- Waylon Smithers\n- Apu Nahasapeemapetilon\n- Kent Brockman\n- Nelson Muntz\n- Barney Gumble\n- Lenny Leonard\n- Edna Krabappel-Flanders\n- Sideshow Bob\n- Dr. Julius Hibbert\n- Selma Bouvier\n- Ralph Wiggum\n- Rev. Timothy Lovejoy\n- Crowd\n- Carl Carlson\n- Patty Bouvier\n- Mayor Joe Quimby\n- Otto Mann\n- Groundskeeper Willie\n- Martin Prince\n- Announcer\n- Comic Book Guy\n- Kids\n- Lionel Hutz\n- HERB\n- Sideshow Mel\n- Gary Chalmers\n- Professor Jonathan Frink\n- Jimbo Jones\n- Lou\n- Todd Flanders\n- Miss Hoover\n- Agnes Skinner\n- Maude Flanders\n- Troy McClure\n- Fat Tony\n- Snake Jailbird\n\nMore Information needed"
] |
66e11c7ddfbe98b3ca9c19880811e81a26943329
|
test
|
haes95/cs_qna_labeling
|
[
"region:us"
] |
2023-09-05T07:19:20+00:00
|
{}
|
2023-09-05T07:23:15+00:00
|
[] |
[] |
TAGS
#region-us
|
test
|
[] |
[
"TAGS\n#region-us \n"
] |
[
6
] |
[
"passage: TAGS\n#region-us \n"
] |
e5b4c9478b5bb83498aab3e403a6e22726f852aa
|
# Dataset Card for Evaluation run of tiiuae/falcon-180B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/tiiuae/falcon-180B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [tiiuae/falcon-180B](https://huggingface.co/tiiuae/falcon-180B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 66 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 32 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-180B",
"harness_winogrande_5",
split="train")
```
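The same call also works for an individual timestamped run instead of the aggregated "latest" split; as a sketch, using a split name taken verbatim from this card's configuration metadata:

```python
# Sketch: load one specific run rather than the "latest" split.
# The split name below is one of the timestamped runs listed in this
# card's config metadata for the harness_gsm8k_5 configuration.
from datasets import load_dataset

run = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-180B",
                   "harness_gsm8k_5",
                   split="2023_10_24T10_17_51.759984")
print(run)
```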
## Latest results
These are the [latest results from run 2023-10-24T10:17:51.759984](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-180B/blob/main/results_2023-10-24T10-17-51.759984.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493806,
"f1": 0.06573301174496615,
"f1_stderr": 0.0013666874377791776,
"acc": 0.6642104078991223,
"acc_stderr": 0.011605139145295384
},
"harness|drop|3": {
"em": 0.0028313758389261743,
"em_stderr": 0.0005441551135493806,
"f1": 0.06573301174496615,
"f1_stderr": 0.0013666874377791776
},
"harness|gsm8k|5": {
"acc": 0.45943896891584535,
"acc_stderr": 0.01372709301042978
},
"harness|winogrande|5": {
"acc": 0.8689818468823993,
"acc_stderr": 0.009483185280160986
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_tiiuae__falcon-180B
|
[
"region:us"
] |
2023-09-05T07:24:35+00:00
|
{"pretty_name": "Evaluation run of tiiuae/falcon-180B", "dataset_summary": "Dataset automatically created during the evaluation run of model [tiiuae/falcon-180B](https://huggingface.co/tiiuae/falcon-180B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 66 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 32 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_tiiuae__falcon-180B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-24T10:17:51.759984](https://huggingface.co/datasets/open-llm-leaderboard/details_tiiuae__falcon-180B/blob/main/results_2023-10-24T10-17-51.759984.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135493806,\n \"f1\": 0.06573301174496615,\n \"f1_stderr\": 0.0013666874377791776,\n \"acc\": 0.6642104078991223,\n \"acc_stderr\": 0.011605139145295384\n },\n \"harness|drop|3\": {\n \"em\": 0.0028313758389261743,\n \"em_stderr\": 0.0005441551135493806,\n \"f1\": 0.06573301174496615,\n \"f1_stderr\": 0.0013666874377791776\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45943896891584535,\n \"acc_stderr\": 0.01372709301042978\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8689818468823993,\n \"acc_stderr\": 0.009483185280160986\n }\n}\n```", "repo_url": "https://huggingface.co/tiiuae/falcon-180B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|arc:challenge|25_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|arc:challenge|25_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|arc:challenge|25_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|arc:challenge|25_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|arc:challenge|25_2023-09-01T15:12:02.263774.parquet"]}, {"split": "2023_09_25T09_30_46.601936", "path": ["**/details_harness|arc:challenge|25_2023-09-25T09-30-46.601936.parquet"]}, {"split": "2023_09_25T09_42_43.006060", "path": ["**/details_harness|arc:challenge|25_2023-09-25T09-42-43.006060.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-25T09-42-43.006060.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_23T17_29_05.444286", "path": 
["**/details_harness|drop|3_2023-10-23T17-29-05.444286.parquet"]}, {"split": "2023_10_24T10_17_51.759984", "path": ["**/details_harness|drop|3_2023-10-24T10-17-51.759984.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-24T10-17-51.759984.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_23T17_29_05.444286", "path": ["**/details_harness|gsm8k|5_2023-10-23T17-29-05.444286.parquet"]}, {"split": "2023_10_24T10_17_51.759984", "path": ["**/details_harness|gsm8k|5_2023-10-24T10-17-51.759984.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-24T10-17-51.759984.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hellaswag|10_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hellaswag|10_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hellaswag|10_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hellaswag|10_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hellaswag|10_2023-09-01T15:12:02.263774.parquet"]}, {"split": "2023_09_25T11_16_10.146827", "path": ["**/details_harness|hellaswag|10_2023-09-25T11-16-10.146827.parquet"]}, {"split": "2023_09_25T11_28_53.879118", "path": ["**/details_harness|hellaswag|10_2023-09-25T11-28-53.879118.parquet"]}, {"split": "2023_09_25T13_20_00.898508", "path": ["**/details_harness|hellaswag|10_2023-09-25T13-20-00.898508.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-25T13-20-00.898508.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:31:39.488381.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:31:39.488381.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-30T14:31:39.488381.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T19:27:57.090829.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-30T19:27:57.090829.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T01:32:36.577851.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T01:32:36.577851.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T01:32:36.577851.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T12:44:38.148712.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-management|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-virology|5_2023-08-31T12:44:38.148712.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:12:02.263774.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": 
["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": 
["**/details_harness|hendrycksTest-econometrics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": 
["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", 
"path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": 
["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": 
["**/details_harness|hendrycksTest-human_aging|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", 
"path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-management|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", 
"path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-virology|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-virology|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:12:02.263774.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_08_30T14_31_39.488381", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-30T14:31:39.488381.parquet"]}, {"split": "2023_08_30T19_27_57.090829", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-30T19:27:57.090829.parquet"]}, {"split": "2023_08_31T01_32_36.577851", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T01:32:36.577851.parquet"]}, {"split": "2023_08_31T12_44_38.148712", "path": ["**/details_harness|truthfulqa:mc|0_2023-08-31T12:44:38.148712.parquet"]}, {"split": "2023_09_01T15_12_02.263774", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-01T15:12:02.263774.parquet"]}, {"split": "2023_09_25T09_49_01.514206", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-25T09-49-01.514206.parquet"]}, {"split": "2023_09_25T09_57_43.547983", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-25T09-57-43.547983.parquet"]}, {"split": "2023_09_25T10_06_12.822356", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-25T10-06-12.822356.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-25T10-06-12.822356.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_23T17_29_05.444286", "path": ["**/details_harness|winogrande|5_2023-10-23T17-29-05.444286.parquet"]}, {"split": "2023_10_24T10_17_51.759984", "path": ["**/details_harness|winogrande|5_2023-10-24T10-17-51.759984.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-24T10-17-51.759984.parquet"]}]}, {"config_name": "original_mmlu_5", "data_files": [{"split": "2023_09_21T14_54_28.631498", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T14-54-28.631498.parquet"]}, {"split": "2023_09_21T15_14_19.361952", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T15-14-19.361952.parquet"]}, {"split": "2023_09_22T15_08_20.868776", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-08-20.868776.parquet"]}, {"split": "2023_09_22T15_09_58.434868", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-09-58.434868.parquet"]}, {"split": "2023_09_22T15_40_03.532661", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-40-03.532661.parquet"]}, {"split": "2023_09_22T19_13_36.680152", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-13-36.680152.parquet"]}, {"split": "2023_09_22T19_25_51.687929", "path": 
["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-25-51.687929.parquet"]}, {"split": "2023_09_22T19_38_30.055713", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-38-30.055713.parquet"]}, {"split": "2023_09_22T19_56_14.188877", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-56-14.188877.parquet"]}, {"split": "2023_09_22T20_44_00.745184", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T20-44-00.745184.parquet"]}, {"split": "2023_09_22T21_16_36.510313", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-16-36.510313.parquet"]}, {"split": "2023_09_22T21_30_38.663736", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-30-38.663736.parquet"]}, {"split": "2023_09_22T21_39_07.387549", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-39-07.387549.parquet"]}, {"split": "2023_09_22T21_46_48.392874", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-46-48.392874.parquet"]}, {"split": "2023_09_22T22_06_13.624503", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-06-13.624503.parquet"]}, {"split": "2023_09_22T22_21_06.865348", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-21-06.865348.parquet"]}, {"split": "2023_09_23T09_44_24.946036", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet"]}]}, {"config_name": "original_mmlu_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_21T14_54_28.631498", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T14-54-28.631498.parquet"]}, {"split": "2023_09_21T15_14_19.361952", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-21T15-14-19.361952.parquet"]}, {"split": "2023_09_22T15_08_20.868776", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-08-20.868776.parquet"]}, {"split": "2023_09_22T15_09_58.434868", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-09-58.434868.parquet"]}, {"split": "2023_09_22T15_40_03.532661", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T15-40-03.532661.parquet"]}, {"split": "2023_09_22T19_13_36.680152", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-13-36.680152.parquet"]}, {"split": "2023_09_22T19_25_51.687929", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-25-51.687929.parquet"]}, {"split": "2023_09_22T19_38_30.055713", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-38-30.055713.parquet"]}, {"split": "2023_09_22T19_56_14.188877", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T19-56-14.188877.parquet"]}, {"split": "2023_09_22T20_44_00.745184", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T20-44-00.745184.parquet"]}, {"split": "2023_09_22T21_16_36.510313", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-16-36.510313.parquet"]}, {"split": 
"2023_09_22T21_30_38.663736", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-30-38.663736.parquet"]}, {"split": "2023_09_22T21_39_07.387549", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-39-07.387549.parquet"]}, {"split": "2023_09_22T21_46_48.392874", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T21-46-48.392874.parquet"]}, {"split": "2023_09_22T22_06_13.624503", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-06-13.624503.parquet"]}, {"split": "2023_09_22T22_21_06.865348", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-22T22-21-06.865348.parquet"]}, {"split": "2023_09_23T09_44_24.946036", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet"]}, {"split": "latest", "path": ["**/details_original|mmlu:high_school_government_and_politics|5_2023-09-23T09-44-24.946036.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_21T14_54_28.631498", "path": ["results_2023-09-21T14-54-28.631498.parquet"]}, {"split": "2023_09_21T15_14_19.361952", "path": ["results_2023-09-21T15-14-19.361952.parquet"]}, {"split": "2023_09_22T15_08_20.868776", "path": ["results_2023-09-22T15-08-20.868776.parquet"]}, {"split": "2023_09_22T15_09_58.434868", "path": ["results_2023-09-22T15-09-58.434868.parquet"]}, {"split": "2023_09_22T15_40_03.532661", "path": ["results_2023-09-22T15-40-03.532661.parquet"]}, {"split": "2023_09_22T19_13_36.680152", "path": ["results_2023-09-22T19-13-36.680152.parquet"]}, {"split": "2023_09_22T19_25_51.687929", "path": ["results_2023-09-22T19-25-51.687929.parquet"]}, {"split": "2023_09_22T19_38_30.055713", "path": ["results_2023-09-22T19-38-30.055713.parquet"]}, {"split": "2023_09_22T19_56_14.188877", "path": ["results_2023-09-22T19-56-14.188877.parquet"]}, {"split": "2023_09_22T20_44_00.745184", "path": ["results_2023-09-22T20-44-00.745184.parquet"]}, {"split": "2023_09_22T21_16_36.510313", "path": ["results_2023-09-22T21-16-36.510313.parquet"]}, {"split": "2023_09_22T21_30_38.663736", "path": ["results_2023-09-22T21-30-38.663736.parquet"]}, {"split": "2023_09_22T21_39_07.387549", "path": ["results_2023-09-22T21-39-07.387549.parquet"]}, {"split": "2023_09_22T21_46_48.392874", "path": ["results_2023-09-22T21-46-48.392874.parquet"]}, {"split": "2023_09_22T22_06_13.624503", "path": ["results_2023-09-22T22-06-13.624503.parquet"]}, {"split": "2023_09_22T22_21_06.865348", "path": ["results_2023-09-22T22-21-06.865348.parquet"]}, {"split": "2023_09_23T09_44_24.946036", "path": ["results_2023-09-23T09-44-24.946036.parquet"]}, {"split": "2023_09_25T09_30_46.601936", "path": ["results_2023-09-25T09-30-46.601936.parquet"]}, {"split": "2023_09_25T09_42_43.006060", "path": ["results_2023-09-25T09-42-43.006060.parquet"]}, {"split": "2023_09_25T09_49_01.514206", "path": ["results_2023-09-25T09-49-01.514206.parquet"]}, {"split": "2023_09_25T09_57_43.547983", "path": ["results_2023-09-25T09-57-43.547983.parquet"]}, {"split": "2023_09_25T10_06_12.822356", "path": ["results_2023-09-25T10-06-12.822356.parquet"]}, {"split": "2023_09_25T11_16_10.146827", "path": ["results_2023-09-25T11-16-10.146827.parquet"]}, {"split": "2023_09_25T11_28_53.879118", "path": ["results_2023-09-25T11-28-53.879118.parquet"]}, {"split": "2023_09_25T13_20_00.898508", "path": ["results_2023-09-25T13-20-00.898508.parquet"]}, {"split": "2023_10_23T17_29_05.444286", "path": 
["results_2023-10-23T17-29-05.444286.parquet"]}, {"split": "2023_10_24T10_17_51.759984", "path": ["results_2023-10-24T10-17-51.759984.parquet"]}, {"split": "latest", "path": ["results_2023-10-24T10-17-51.759984.parquet"]}]}]}
|
2023-10-24T09:18:04+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of tiiuae/falcon-180B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model tiiuae/falcon-180B on the Open LLM Leaderboard.
The dataset is composed of 66 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 32 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
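A minimal sketch of that call (the repository name below is assumed from the leaderboard's usual `details_<org>__<model>` naming, and `harness_winogrande_5` is just one of the configurations declared in this repo's metadata):

```python
from datasets import load_dataset

# Repository name assumed from the standard details naming; any configuration
# declared in the metadata (e.g. "results") can be substituted.
data = load_dataset("open-llm-leaderboard/details_tiiuae__falcon-180B",
	"harness_winogrande_5",
	split="train")
```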
## Latest results
These are the latest results from run 2023-10-24T10:17:51.759984 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of tiiuae/falcon-180B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model tiiuae/falcon-180B on the Open LLM Leaderboard.\n\nThe dataset is composed of 66 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 32 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T10:17:51.759984(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of tiiuae/falcon-180B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model tiiuae/falcon-180B on the Open LLM Leaderboard.\n\nThe dataset is composed of 66 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 32 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-24T10:17:51.759984(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of tiiuae/falcon-180B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model tiiuae/falcon-180B on the Open LLM Leaderboard.\n\nThe dataset is composed of 66 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 32 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-24T10:17:51.759984(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
b1c2fb23e4adafda657d1ee8d842ddfb9dd229fd
|
# Dataset Card for "dsc_model"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
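Until the card is filled in, the configuration metadata (string columns `document`, `claim`, `label`; `train` and `test` splits) suggests a minimal usage sketch like the following:

```python
from datasets import load_dataset

# Columns per the dataset config: document, claim, label (all strings)
ds = load_dataset("NgThVinh/dsc_model")
example = ds["train"][0]
print(example["claim"], "->", example["label"])
```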
|
NgThVinh/dsc_model
|
[
"region:us"
] |
2023-09-05T07:26:59+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "document", "dtype": "string"}, {"name": "claim", "dtype": "string"}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 113122532.72787674, "num_examples": 132448}, {"name": "test", "num_bytes": 28281487.272123266, "num_examples": 33113}], "download_size": 89644483, "dataset_size": 141404020.0}}
|
2023-09-05T07:27:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "dsc_model"
More Information needed
|
[
"# Dataset Card for \"dsc_model\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"dsc_model\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"dsc_model\"\n\nMore Information needed"
] |
4149d381779862ec14d56fea643b411ddfea7b47
|
# Dataset Card for Evaluation run of openchat/openchat_v3.2_super
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v3.2_super
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v3.2_super](https://huggingface.co/openchat/openchat_v3.2_super) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2_super",
"harness_winogrande_5",
split="train")
```
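Since the summary above also mentions the aggregated "results" configuration, here is a sketch of loading it directly; the "latest" split name comes straight from the split list declared in this repo's metadata:

```python
from datasets import load_dataset

# "latest" always points at the newest run's aggregated metrics
results = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2_super",
	"results",
	split="latest")
print(results[0])
```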
## Latest results
These are the [latest results from run 2023-10-18T01:02:51.015590](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2_super/blob/main/results_2023-10-18T01-02-51.015590.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982623,
"f1": 0.058767827181208196,
"f1_stderr": 0.0013192048135182055,
"acc": 0.4471122977692914,
"acc_stderr": 0.010713437247397681
},
"harness|drop|3": {
"em": 0.0014681208053691276,
"em_stderr": 0.00039210421902982623,
"f1": 0.058767827181208196,
"f1_stderr": 0.0013192048135182055
},
"harness|gsm8k|5": {
"acc": 0.13495072024260804,
"acc_stderr": 0.009411315282571171
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224192
}
}
```
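Equivalently, the raw JSON file linked above can be fetched directly. A sketch using `huggingface_hub` (the file name is copied from the link above; `repo_type="dataset"` is needed because this is a dataset repository):

```python
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_openchat__openchat_v3.2_super",
    filename="results_2023-10-18T01-02-51.015590.json",
    repo_type="dataset",
)
with open(path) as f:
    print(json.load(f)["all"])  # the aggregated block shown above
```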
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_openchat__openchat_v3.2_super
|
[
"region:us"
] |
2023-09-05T07:29:13+00:00
|
{"pretty_name": "Evaluation run of openchat/openchat_v3.2_super", "dataset_summary": "Dataset automatically created during the evaluation run of model [openchat/openchat_v3.2_super](https://huggingface.co/openchat/openchat_v3.2_super) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v3.2_super\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-18T01:02:51.015590](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2_super/blob/main/results_2023-10-18T01-02-51.015590.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982623,\n \"f1\": 0.058767827181208196,\n \"f1_stderr\": 0.0013192048135182055,\n \"acc\": 0.4471122977692914,\n \"acc_stderr\": 0.010713437247397681\n },\n \"harness|drop|3\": {\n \"em\": 0.0014681208053691276,\n \"em_stderr\": 0.00039210421902982623,\n \"f1\": 0.058767827181208196,\n \"f1_stderr\": 0.0013192048135182055\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.13495072024260804,\n \"acc_stderr\": 0.009411315282571171\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224192\n }\n}\n```", "repo_url": "https://huggingface.co/openchat/openchat_v3.2_super", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|arc:challenge|25_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_18T01_02_51.015590", "path": ["**/details_harness|drop|3_2023-10-18T01-02-51.015590.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-18T01-02-51.015590.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_18T01_02_51.015590", "path": ["**/details_harness|gsm8k|5_2023-10-18T01-02-51.015590.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-18T01-02-51.015590.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hellaswag|10_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T08:28:49.460161.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T08:28:49.460161.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_18T01_02_51.015590", "path": ["**/details_harness|winogrande|5_2023-10-18T01-02-51.015590.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-18T01-02-51.015590.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T08_28_49.460161", "path": ["results_2023-09-05T08:28:49.460161.parquet"]}, {"split": "2023_10_18T01_02_51.015590", "path": ["results_2023-10-18T01-02-51.015590.parquet"]}, {"split": "latest", "path": ["results_2023-10-18T01-02-51.015590.parquet"]}]}]}
|
2023-10-18T00:03:03+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of openchat/openchat_v3.2_super
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model openchat/openchat_v3.2_super on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-18T01:02:51.015590 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of openchat/openchat_v3.2_super",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.2_super on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T01:02:51.015590(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of openchat/openchat_v3.2_super",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.2_super on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-18T01:02:51.015590(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of openchat/openchat_v3.2_super## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model openchat/openchat_v3.2_super on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-18T01:02:51.015590(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
63851adb2dbec8b4839b8005aade4d09f9994463
|
# Dataset Card for "llama2_classifying_and_explainning_v5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
RikoteMaster/llama2_classifying_and_explainning_v5
|
[
"region:us"
] |
2023-09-05T07:52:05+00:00
|
{"dataset_info": {"features": [{"name": "Explanation", "dtype": "string"}, {"name": "Text_processed", "dtype": "string"}, {"name": "Emotion", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 53692144, "num_examples": 47512}], "download_size": 16909110, "dataset_size": 53692144}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T07:52:10+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "llama2_classifying_and_explainning_v5"
More Information needed
|
[
"# Dataset Card for \"llama2_classifying_and_explainning_v5\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"llama2_classifying_and_explainning_v5\"\n\nMore Information needed"
] |
[
6,
25
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"llama2_classifying_and_explainning_v5\"\n\nMore Information needed"
] |
b4fb0157e37e3b821138a76c0c28c75e25f55df2
|
# Dataset Card for "Dataset_V1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
kristinashemet/Dataset_V1
|
[
"region:us"
] |
2023-09-05T08:01:08+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11067905, "num_examples": 1613}], "download_size": 1064017, "dataset_size": 11067905}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-10-10T06:04:43+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "Dataset_V1"
More Information needed
|
[
"# Dataset Card for \"Dataset_V1\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"Dataset_V1\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"Dataset_V1\"\n\nMore Information needed"
] |
d2f839eb81362823eff1daaf51566ccf5e0cc7c7
|
# Dataset Card for Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b](https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b",
"harness_winogrande_5",
split="train")
```
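The aggregated metrics can be loaded the same way. A minimal sketch, assuming the "results" configuration and "latest" split listed in this repository's configs:
```python
from datasets import load_dataset

# Load the aggregated "results" configuration; the "latest" split always
# mirrors the most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b",
	"results",
	split="latest")
```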
## Latest results
These are the [latest results from run 2023-10-22T11:52:44.737465](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b/blob/main/results_2023-10-22T11-52-44.737465.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670005,
"f1": 0.05511640100671131,
"f1_stderr": 0.0012812534382648734,
"acc": 0.46045352575705806,
"acc_stderr": 0.01174889042363714
},
"harness|drop|3": {
"em": 0.0019924496644295304,
"em_stderr": 0.00045666764626670005,
"f1": 0.05511640100671131,
"f1_stderr": 0.0012812534382648734
},
"harness|gsm8k|5": {
"acc": 0.19636087945413191,
"acc_stderr": 0.010942090791564744
},
"harness|winogrande|5": {
"acc": 0.7245461720599842,
"acc_stderr": 0.012555690055709537
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b
|
[
"region:us"
] |
2023-09-05T08:02:41+00:00
|
{"pretty_name": "Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b", "dataset_summary": "Dataset automatically created during the evaluation run of model [ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b](https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T11:52:44.737465](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b/blob/main/results_2023-10-22T11-52-44.737465.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626670005,\n \"f1\": 0.05511640100671131,\n \"f1_stderr\": 0.0012812534382648734,\n \"acc\": 0.46045352575705806,\n \"acc_stderr\": 0.01174889042363714\n },\n \"harness|drop|3\": {\n \"em\": 0.0019924496644295304,\n \"em_stderr\": 0.00045666764626670005,\n \"f1\": 0.05511640100671131,\n \"f1_stderr\": 0.0012812534382648734\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.19636087945413191,\n \"acc_stderr\": 0.010942090791564744\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7245461720599842,\n \"acc_stderr\": 0.012555690055709537\n }\n}\n```", "repo_url": "https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|arc:challenge|25_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T11_52_44.737465", "path": ["**/details_harness|drop|3_2023-10-22T11-52-44.737465.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T11-52-44.737465.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T11_52_44.737465", "path": ["**/details_harness|gsm8k|5_2023-10-22T11-52-44.737465.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T11-52-44.737465.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hellaswag|10_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T09:02:22.331640.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T09:02:22.331640.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T11_52_44.737465", "path": ["**/details_harness|winogrande|5_2023-10-22T11-52-44.737465.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T11-52-44.737465.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T09_02_22.331640", "path": ["results_2023-09-05T09:02:22.331640.parquet"]}, {"split": "2023_10_22T11_52_44.737465", "path": ["results_2023-10-22T11-52-44.737465.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T11-52-44.737465.parquet"]}]}]}
|
2023-10-22T10:52:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-22T11:52:44.737465 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T11:52:44.737465(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T11:52:44.737465(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
29,
31,
177,
66,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T11:52:44.737465(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
978ea04ce3caa4e9fdb25febda3632e867543851
|
# Dataset Card for "100By100BranchPNG"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
JCAI2000/100By100BranchPNG
|
[
"region:us"
] |
2023-09-05T08:13:45+00:00
|
{"dataset_info": {"features": [{"name": "pixel_values", "dtype": "image"}, {"name": "label", "dtype": "image"}], "splits": [{"name": "train", "num_bytes": 1947502.0, "num_examples": 47}], "download_size": 189123, "dataset_size": 1947502.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-06T05:03:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "100By100BranchPNG"
More Information needed
|
[
"# Dataset Card for \"100By100BranchPNG\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"100By100BranchPNG\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"100By100BranchPNG\"\n\nMore Information needed"
] |
c2f897dfc179f1ba91dc222f8407e2e51fd01262
|
# CarlaFollowLanePreviousV
This dataset contains images extracted from the CARLA simulator using an expert agent, intended for imitation learning.
The expert agent is the autopilot of CARLA 0.9.12. We store the bird's-eye-view images from the camera and the corresponding control commands generated
by the agent, including the previous speed.
## Dataset details
| Folder | Scenario | Number of examples |
| ------------------------------- | ---------------------- | --------------------------------- |
| carla_dataset_16_11_clockwise_town_01_extreme_2 | Town01 | 1582 |
| carla_dataset_24_07_anticlockwise_town_01_extreme | Town01 | 4957 |
| carla_dataset_test_04_11_clockwise_town_01_previous_v_extreme | Town01 | 1911 |
| carla_dataset_test_31_10_anticlockwise_town_01_previous_v | Town01 | 6184 |
| carla_dataset_test_31_10_clockwise_town_01_previous_v | Town01 | 6056 |
| carla_dataset_test_04_11_anticlockwise_town_03_previous_v | Town03 | 7285 |
| carla_dataset_test_04_11_clockwise_town_03_previous_v | Town03 | 5487 |
| carla_dataset_test_04_11_anticlockwise_town_05_previous_v | Town05 | 10375 |
| carla_dataset_test_04_11_clockwise_town_05_previous_v | Town05 | 12094 |
| carla_dataset_test_04_11_anticlockwise_town_07_previous_v | Town07 | 1781 |
| carla_dataset_test_04_11_clockwise_town_07_previous_v | Town07 | 1930 |
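For reference, a quick tally of the table above (the per-folder counts are copied verbatim; the per-town and overall totals are derived here, not stated elsewhere in the card):

```python
# Example counts per town, copied from the table above.
examples = {
    "Town01": [1582, 4957, 1911, 6184, 6056],
    "Town03": [7285, 5487],
    "Town05": [10375, 12094],
    "Town07": [1781, 1930],
}

for town, counts in examples.items():
    print(town, sum(counts))  # Town01: 20690, Town03: 12772, Town05: 22469, Town07: 3711

print("total", sum(sum(c) for c in examples.values()))  # 59642 examples overall
```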
|
sergiopaniego/CarlaFollowLanePreviousV
|
[
"license:apache-2.0",
"region:us"
] |
2023-09-05T08:23:43+00:00
|
{"license": "apache-2.0"}
|
2023-09-06T12:12:45+00:00
|
[] |
[] |
TAGS
#license-apache-2.0 #region-us
|
CarlaFollowLanePreviousV
========================
This dataset contains images extracted from the CARLA simulator using an expert agent, intended for imitation learning.
The expert agent is the autopilot of CARLA 0.9.12. We store the bird's-eye-view images from the camera and the corresponding control commands generated
by the agent, including the previous speed.
Dataset details
---------------
Folder: carla\_dataset\_16\_11\_clockwise\_town\_01\_extreme\_2, Scenario: Town01, Number of examples: 1582
Folder: carla\_dataset\_24\_07\_anticlockwise\_town\_01\_extreme, Scenario: Town01, Number of examples: 4957
Folder: carla\_dataset\_test\_04\_11\_clockwise\_town\_01\_previous\_v\_extreme, Scenario: Town01, Number of examples: 1911
Folder: carla\_dataset\_test\_31\_10\_anticlockwise\_town\_01\_previous\_v, Scenario: Town01, Number of examples: 6184
Folder: carla\_dataset\_test\_31\_10\_clockwise\_town\_01\_previous\_v, Scenario: Town01, Number of examples: 6056
Folder: carla\_dataset\_test\_04\_11\_anticlockwise\_town\_03\_previous\_v, Scenario: Town03, Number of examples: 7285
Folder: carla\_dataset\_test\_04\_11\_clockwise\_town\_03\_previous\_v, Scenario: Town03, Number of examples: 5487
Folder: carla\_dataset\_test\_04\_11\_anticlockwise\_town\_05\_previous\_v, Scenario: Town05, Number of examples: 10375
Folder: carla\_dataset\_test\_04\_11\_clockwise\_town\_05\_previous\_v, Scenario: Town05, Number of examples: 12094
Folder: carla\_dataset\_test\_04\_11\_anticlockwise\_town\_07\_previous\_v, Scenario: Town07, Number of examples: 1781
Folder: carla\_dataset\_test\_04\_11\_clockwise\_town\_07\_previous\_v, Scenario: Town07, Number of examples: 1930
|
[] |
[
"TAGS\n#license-apache-2.0 #region-us \n"
] |
[
14
] |
[
"passage: TAGS\n#license-apache-2.0 #region-us \n"
] |
ec1c19a6bcde565f40c804ee7b7add3375258890
|
# Dataset Card for "pco_audio_data_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
warleagle/pco_audio_data_v3
|
[
"region:us"
] |
2023-09-05T08:32:32+00:00
|
{"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}], "splits": [{"name": "train", "num_bytes": 447042388.0, "num_examples": 19}], "download_size": 447020757, "dataset_size": 447042388.0}}
|
2023-09-05T08:34:05+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "pco_audio_data_v3"
More Information needed
|
[
"# Dataset Card for \"pco_audio_data_v3\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"pco_audio_data_v3\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"pco_audio_data_v3\"\n\nMore Information needed"
] |
cc840edfc5e3a8aea0be9e17d170427159bb50d2
|
# Dataset Card for Evaluation run of Fredithefish/Guanaco-7B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/Guanaco-7B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored",
"harness_winogrande_5",
split="train")
```
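Earlier runs stay available under their timestamped splits. A minimal sketch, assuming the configuration and split names listed in this repository's configs:
```python
from datasets import load_dataset

# The "latest" split of any configuration points to the most recent run;
# earlier runs remain accessible under their timestamped split names.
data = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored",
	"harness_gsm8k_5",
	split="latest")
print(data)
```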
## Latest results
These are the [latest results from run 2023-10-12T17:19:39.610338](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored/blob/main/results_2023-10-12T17-19-39.610338.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606556,
"f1": 0.05823930369127524,
"f1_stderr": 0.001346062439091187,
"acc": 0.38665835314476715,
"acc_stderr": 0.009009374850629389
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964606556,
"f1": 0.05823930369127524,
"f1_stderr": 0.001346062439091187
},
"harness|gsm8k|5": {
"acc": 0.04245640636846096,
"acc_stderr": 0.005553837749990045
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268733
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored
|
[
"region:us"
] |
2023-09-05T08:42:50+00:00
|
{"pretty_name": "Evaluation run of Fredithefish/Guanaco-7B-Uncensored", "dataset_summary": "Dataset automatically created during the evaluation run of model [Fredithefish/Guanaco-7B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T17:19:39.610338](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored/blob/main/results_2023-10-12T17-19-39.610338.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606556,\n \"f1\": 0.05823930369127524,\n \"f1_stderr\": 0.001346062439091187,\n \"acc\": 0.38665835314476715,\n \"acc_stderr\": 0.009009374850629389\n },\n \"harness|drop|3\": {\n \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964606556,\n \"f1\": 0.05823930369127524,\n \"f1_stderr\": 0.001346062439091187\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04245640636846096,\n \"acc_stderr\": 0.005553837749990045\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268733\n }\n}\n```", "repo_url": "https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|arc:challenge|25_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T17_19_39.610338", "path": ["**/details_harness|drop|3_2023-10-12T17-19-39.610338.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T17-19-39.610338.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T17_19_39.610338", "path": ["**/details_harness|gsm8k|5_2023-10-12T17-19-39.610338.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T17-19-39.610338.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hellaswag|10_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T09:42:26.662725.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T09:42:26.662725.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T09:42:26.662725.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T17_19_39.610338", "path": ["**/details_harness|winogrande|5_2023-10-12T17-19-39.610338.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T17-19-39.610338.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T09_42_26.662725", "path": ["results_2023-09-05T09:42:26.662725.parquet"]}, {"split": "2023_10_12T17_19_39.610338", "path": ["results_2023-10-12T17-19-39.610338.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T17-19-39.610338.parquet"]}]}]}
|
2023-10-12T16:19:53+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Fredithefish/Guanaco-7B-Uncensored
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Fredithefish/Guanaco-7B-Uncensored on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-10-12T17:19:39.610338 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Fredithefish/Guanaco-7B-Uncensored",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/Guanaco-7B-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T17:19:39.610338 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Fredithefish/Guanaco-7B-Uncensored",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/Guanaco-7B-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T17:19:39.610338 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
22,
31,
170,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Fredithefish/Guanaco-7B-Uncensored## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Fredithefish/Guanaco-7B-Uncensored on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The \"train\" split always points to the latest results.\n\nAn additional configuration \"results\" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T17:19:39.610338 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
bd94cef7e8ee6a592aa935c06376522a35ceaf2c
|
test
|
Elliot4AI/testpatent
|
[
"task_categories:text-classification",
"size_categories:n<1K",
"language:zh",
"license:apache-2.0",
"chemistry",
"region:us"
] |
2023-09-05T08:43:18+00:00
|
{"language": ["zh"], "license": "apache-2.0", "size_categories": ["n<1K"], "task_categories": ["text-classification"], "tags": ["chemistry"]}
|
2023-09-05T08:51:49+00:00
|
[] |
[
"zh"
] |
TAGS
#task_categories-text-classification #size_categories-n<1K #language-Chinese #license-apache-2.0 #chemistry #region-us
|
test
|
[] |
[
"TAGS\n#task_categories-text-classification #size_categories-n<1K #language-Chinese #license-apache-2.0 #chemistry #region-us \n"
] |
[
44
] |
[
"passage: TAGS\n#task_categories-text-classification #size_categories-n<1K #language-Chinese #license-apache-2.0 #chemistry #region-us \n"
] |
df49158bc48136aaa260a847a1bb02b9cb22d24e
|
## Dataset Summary
This dataset collection consists of three main components, each designed to deal with different aspects of autonomous driving tasks such as lane following and obstacle avoidance. The three zip files, namely basic-dataset.zip, traffic-1-assets.zip, and traffic-5-assets.zip, serve as the foundational building blocks for generating three distinct datasets:
#### Traffic-0:
Derived from basic-dataset.zip. Contains data exclusively for lane following without the presence of other vehicles on the road.
#### Traffic-1:
Generated by combining basic-dataset.zip with traffic-1-assets.zip. Contains lane following scenarios and introduces one specific type of front vehicle.
#### Traffic-6:
Generated by combining all three zip files (basic-dataset.zip, traffic-1-assets.zip, and traffic-5-assets.zip). Offers the most complete dataset for lane following, featuring a diverse range of front vehicle types.
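As a sketch of how the three variants might be assembled locally (assuming the archives share a common directory layout, so that later archives simply add assets on top of basic-dataset.zip; the output directory names are illustrative):
```python
import zipfile
from pathlib import Path

def build_variant(archives, out_dir):
    """Extract the listed zip files into one directory to form a dataset variant."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for archive in archives:
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(out)  # later archives add front-vehicle assets on top
    return out

build_variant(["basic-dataset.zip"], "traffic-0")  # lane following only
build_variant(["basic-dataset.zip", "traffic-1-assets.zip"], "traffic-1")
build_variant(["basic-dataset.zip", "traffic-1-assets.zip", "traffic-5-assets.zip"], "traffic-6")
```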
|
YujiroS/traffic-6
|
[
"region:us"
] |
2023-09-05T08:57:04+00:00
|
{}
|
2023-09-06T07:37:59+00:00
|
[] |
[] |
TAGS
#region-us
|
## Dataset Summary
This dataset collection consists of three main components, each designed to deal with different aspects of autonomous driving tasks such as lane following and obstacle avoidance. The three zip files, namely URL, URL, and URL, serve as the foundational building blocks for generating three distinct datasets:
#### Traffic-0:
Derived from URL. Contains data exclusively for lane following without the presence of other vehicles on the road.
#### Traffic-1:
Generated by combining URL with URL. Contains lane following scenarios and introduces one specific type of front vehicle.
#### Traffic-6:
Generated by combining all three zip files (URL, URL, and URL). Offers the most complete dataset for lane following, featuring a diverse range of front vehicle types.
|
[
"## Dataset Summary\n\nThis dataset collection consists of three main components, each designed to deal with different aspects of autonomous driving tasks such as lane following and obstacle avoidance. The three zip files, namely URL, URL, and URL, serve as the foundational building blocks for generating three distinct datasets:",
"#### Traffic-0:\nDerived from URL. Contains data exclusively for lane following without the presence of other vehicles on the road.",
"#### Traffic-1:\nGenerated by combining URL with URL. Contains lane following scenarios and introduces one specific type of front vehicle.",
"#### Traffic-6:\nGenerated by combining all three zip files (URL, URL, and URL). Offers the most complete dataset for lane following, featuring a diverse range of front vehicle types."
] |
[
"TAGS\n#region-us \n",
"## Dataset Summary\n\nThis dataset collection consists of three main components, each designed to deal with different aspects of autonomous driving tasks such as lane following and obstacle avoidance. The three zip files, namely URL, URL, and URL, serve as the foundational building blocks for generating three distinct datasets:",
"#### Traffic-0:\nDerived from URL. Contains data exclusively for lane following without the presence of other vehicles on the road.",
"#### Traffic-1:\nGenerated by combining URL with URL. Contains lane following scenarios and introduces one specific type of front vehicle.",
"#### Traffic-6:\nGenerated by combining all three zip files (URL, URL, and URL). Offers the most complete dataset for lane following, featuring a diverse range of front vehicle types."
] |
[
6,
72,
30,
32,
44
] |
[
"passage: TAGS\n#region-us \n## Dataset Summary\n\nThis dataset collection consists of three main components, each designed to deal with different aspects of autonomous driving tasks such as lane following and obstacle avoidance. The three zip files, namely URL, URL, and URL, serve as the foundational building blocks for generating three distinct datasets:#### Traffic-0:\nDerived from URL. Contains data exclusively for lane following without the presence of other vehicles on the road.#### Traffic-1:\nGenerated by combining URL with URL. Contains lane following scenarios and introduces one specific type of front vehicle.#### Traffic-6:\nGenerated by combining all three zip files (URL, URL, and URL). Offers the most complete dataset for lane following, featuring a diverse range of front vehicle types."
] |
f4bc67fbdbf9df31ec56cf4736a4b869596f5bae
|
# Dataset Card for "tiny_shakespeare_dialogue"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pierre-pessarossi/tiny_shakespeare_dialogue
|
[
"region:us"
] |
2023-09-05T08:59:45+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2798654, "num_examples": 6281}, {"name": "validation", "num_bytes": 166728, "num_examples": 439}, {"name": "test", "num_bytes": 115868, "num_examples": 498}], "download_size": 957486, "dataset_size": 3081250}}
|
2023-09-05T08:59:52+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "tiny_shakespeare_dialogue"
More Information needed
|
[
"# Dataset Card for \"tiny_shakespeare_dialogue\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"tiny_shakespeare_dialogue\"\n\nMore Information needed"
] |
[
6,
20
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"tiny_shakespeare_dialogue\"\n\nMore Information needed"
] |
a231d8e20b2d8685afe1f34bcf3ea3dca43604fe
|
# Dataset Card for "bnf_gallica"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
manu/bnf_gallica
|
[
"region:us"
] |
2023-09-05T09:02:14+00:00
|
{"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2628706901, "num_examples": 5907}], "download_size": 1521206509, "dataset_size": 2628706901}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T09:04:11+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "bnf_gallica"
More Information needed
|
[
"# Dataset Card for \"bnf_gallica\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"bnf_gallica\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"bnf_gallica\"\n\nMore Information needed"
] |
a691c4e91f5290777081fa3af26b2c750190e602
|
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-12T14:49:01.591870](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w/blob/main/results_2023-10-12T14-49-01.591870.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436388,
"f1": 0.06228817114093964,
"f1_stderr": 0.0014101371508567083,
"acc": 0.4173053896873633,
"acc_stderr": 0.009418776710625477
},
"harness|drop|3": {
"em": 0.003145973154362416,
"em_stderr": 0.0005734993648436388,
"f1": 0.06228817114093964,
"f1_stderr": 0.0014101371508567083
},
"harness|gsm8k|5": {
"acc": 0.06823351023502654,
"acc_stderr": 0.006945358944067431
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
}
}
```
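Since this repository stores two runs, the timestamped splits of the "results" configuration can be loaded side by side to compare them. A minimal sketch (the split names are taken from this card's config metadata and are assumed to apply to the "results" config as well):
```python
from datasets import load_dataset

repo = "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w"

# Each run is a timestamped split; "latest" always aliases the newest one.
run_sep = load_dataset(repo, "results", split="2023_09_05T10_13_11.603787")
run_oct = load_dataset(repo, "results", split="2023_10_12T14_49_01.591870")
print(run_sep[0].keys())
print(run_oct[0].keys())
```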
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w
|
[
"region:us"
] |
2023-09-05T09:13:36+00:00
|
{"pretty_name": "Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w", "dataset_summary": "Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-12T14:49:01.591870](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w/blob/main/results_2023-10-12T14-49-01.591870.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436388,\n \"f1\": 0.06228817114093964,\n \"f1_stderr\": 0.0014101371508567083,\n \"acc\": 0.4173053896873633,\n \"acc_stderr\": 0.009418776710625477\n },\n \"harness|drop|3\": {\n \"em\": 0.003145973154362416,\n \"em_stderr\": 0.0005734993648436388,\n \"f1\": 0.06228817114093964,\n \"f1_stderr\": 0.0014101371508567083\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06823351023502654,\n \"acc_stderr\": 0.006945358944067431\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n }\n}\n```", "repo_url": "https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|arc:challenge|25_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_12T14_49_01.591870", "path": ["**/details_harness|drop|3_2023-10-12T14-49-01.591870.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-12T14-49-01.591870.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_12T14_49_01.591870", "path": ["**/details_harness|gsm8k|5_2023-10-12T14-49-01.591870.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-12T14-49-01.591870.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hellaswag|10_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hellaswag|10_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet", 
"**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet", 
"**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet", 
"**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": 
"2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T10:13:11.603787.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T10:13:11.603787.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_12T14_49_01.591870", "path": ["**/details_harness|winogrande|5_2023-10-12T14-49-01.591870.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-12T14-49-01.591870.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T10_13_11.603787", "path": ["results_2023-09-05T10:13:11.603787.parquet"]}, {"split": "2023_10_12T14_49_01.591870", "path": ["results_2023-10-12T14-49-01.591870.parquet"]}, {"split": "latest", "path": ["results_2023-10-12T14-49-01.591870.parquet"]}]}]}
|
2023-10-12T13:49:14+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
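```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w",
	"harness_winogrande_5",
	split="train")
```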
## Latest results
These are the latest results from run 2023-10-12T14:49:01.591870 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
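```python
{
    "all": {
        "em": 0.003145973154362416,
        "em_stderr": 0.0005734993648436388,
        "f1": 0.06228817114093964,
        "f1_stderr": 0.0014101371508567083,
        "acc": 0.4173053896873633,
        "acc_stderr": 0.009418776710625477
    },
    "harness|drop|3": {
        "em": 0.003145973154362416,
        "em_stderr": 0.0005734993648436388,
        "f1": 0.06228817114093964,
        "f1_stderr": 0.0014101371508567083
    },
    "harness|gsm8k|5": {
        "acc": 0.06823351023502654,
        "acc_stderr": 0.006945358944067431
    },
    "harness|winogrande|5": {
        "acc": 0.7663772691397001,
        "acc_stderr": 0.011892194477183524
    }
}
```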
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T14:49:01.591870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-12T14:49:01.591870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
34,
31,
182,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-12T14:49:01.591870(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e4a8d1c613b420fcb3134f6c2f43c0e7208572f5
|
# Dataset Card for "code_prompt_evol"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
pvduy/code_prompt_evol
|
[
"region:us"
] |
2023-09-05T09:20:51+00:00
|
{"dataset_info": {"features": [{"name": "instruction", "dtype": "string"}, {"name": "source", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 89055932, "num_examples": 408974}], "download_size": 40713166, "dataset_size": 89055932}}
|
2023-09-05T09:20:55+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "code_prompt_evol"
More Information needed
|
[
"# Dataset Card for \"code_prompt_evol\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"code_prompt_evol\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"code_prompt_evol\"\n\nMore Information needed"
] |
9c28ed4b7c87b720f030b5996ef6e7789d7f4a8a
|
# Dataset Card for "lrs3-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
mattymchen/lrs3-test
|
[
"region:us"
] |
2023-09-05T09:34:50+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "idx", "dtype": "int64"}, {"name": "audio", "sequence": "int16"}, {"name": "video", "sequence": {"sequence": {"sequence": "uint8"}}}, {"name": "label", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 824374107, "num_examples": 1321}], "download_size": 677311360, "dataset_size": 824374107}}
|
2023-09-05T09:37:16+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "lrs3-test"
More Information needed
|
[
"# Dataset Card for \"lrs3-test\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"lrs3-test\"\n\nMore Information needed"
] |
[
6,
14
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"lrs3-test\"\n\nMore Information needed"
] |
9e4ad09673c25ba433d62891f1bec7b49cad5c0a
|
# Smart Contracts Instructions
A dataset containing 6,003 GPT-generated pairs of human instructions and Solidity source code.
The GPT models used to make this data are GPT-3.5 Turbo, GPT-3.5 Turbo 16k context, and GPT-4. The Solidity source code comes from mwritescode's Slither Audited Smart Contracts (https://huggingface.co/datasets/mwritescode/slither-audited-smart-contracts).
Distributions of the GPT models used to make this dataset:
- GPT-3.5 Turbo: 5,276
- GPT-3.5 Turbo 16k Context: 678
- GPT-4: 49
The Solidity source code in this dataset has been processed to replace runs of three or more newline characters with double newlines and to delete "Submitted for verification at " comments.
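A minimal sketch of this preprocessing (an illustrative assumption, not the exact script used to build the dataset):
```py
import re

def clean_solidity(source: str) -> str:
    # Collapse runs of three or more newlines into exactly two
    source = re.sub(r"\n{3,}", "\n\n", source)
    # Drop comment lines containing "Submitted for verification at "
    return re.sub(r"^.*Submitted for verification at .*\n?", "", source, flags=re.MULTILINE)
```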
# Example Usage
```py
from datasets import load_dataset
# Load dataset
dataset = load_dataset("AlfredPros/smart-contracts-instructions", split="train")
# Print the first row instruction
print(dataset["instruction"][0])
```
|
AlfredPros/smart-contracts-instructions
|
[
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:en",
"code",
"blockchain",
"smart contract",
"solidity",
"region:us"
] |
2023-09-05T09:39:18+00:00
|
{"language": ["en"], "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"], "tags": ["code", "blockchain", "smart contract", "solidity"], "viewer": true}
|
2023-12-02T09:12:25+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-question-answering #size_categories-1K<n<10K #language-English #code #blockchain #smart contract #solidity #region-us
|
# Smart Contracts Instructions
A dataset containing 6,003 GPT-generated pairs of human instructions and Solidity source code.
The GPT models used to make this data are GPT-3.5 Turbo, GPT-3.5 Turbo 16k context, and GPT-4. The Solidity source code comes from mwritescode's Slither Audited Smart Contracts (URL).
Distributions of the GPT models used to make this dataset:
- GPT-3.5 Turbo: 5,276
- GPT-3.5 Turbo 16k Context: 678
- GPT-4: 49
The Solidity source code in this dataset has been processed to replace runs of three or more newline characters with double newlines and to delete "Submitted for verification at " comments.
# Example Usage
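```py
from datasets import load_dataset

# Load dataset
dataset = load_dataset("AlfredPros/smart-contracts-instructions", split="train")

# Print the first row instruction
print(dataset["instruction"][0])
```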
|
[
"# Smart Contracts Instructions\n\nA dataset containing 6,003 GPT-generated human instruction and Solidity source code data pairs.\nGPT models used to make this data are GPT-3.5 turbo, GPT-3.5 turbo 16k context, and GPT-4. Solidity source codes are used from mwritescode's Slither Audited Smart Contracts (URL\n\nDistributions of the GPT models used to make this dataset:\n- GPT-3.5 Turbo: 5,276\n- GPT-3.5 Turbo 16k Context: 678\n- GPT-4: 49\n\nSolidity source codes in this dataset has been processed to replace triple or more newline characters with double newline characters and delete \"Submitted for verification at \" comments.",
"# Example Usage"
] |
[
"TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-English #code #blockchain #smart contract #solidity #region-us \n",
"# Smart Contracts Instructions\n\nA dataset containing 6,003 GPT-generated human instruction and Solidity source code data pairs.\nGPT models used to make this data are GPT-3.5 turbo, GPT-3.5 turbo 16k context, and GPT-4. Solidity source codes are used from mwritescode's Slither Audited Smart Contracts (URL\n\nDistributions of the GPT models used to make this dataset:\n- GPT-3.5 Turbo: 5,276\n- GPT-3.5 Turbo 16k Context: 678\n- GPT-4: 49\n\nSolidity source codes in this dataset has been processed to replace triple or more newline characters with double newline characters and delete \"Submitted for verification at \" comments.",
"# Example Usage"
] |
[
45,
167,
5
] |
[
"passage: TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-English #code #blockchain #smart contract #solidity #region-us \n# Smart Contracts Instructions\n\nA dataset containing 6,003 GPT-generated human instruction and Solidity source code data pairs.\nGPT models used to make this data are GPT-3.5 turbo, GPT-3.5 turbo 16k context, and GPT-4. Solidity source codes are used from mwritescode's Slither Audited Smart Contracts (URL\n\nDistributions of the GPT models used to make this dataset:\n- GPT-3.5 Turbo: 5,276\n- GPT-3.5 Turbo 16k Context: 678\n- GPT-4: 49\n\nSolidity source codes in this dataset has been processed to replace triple or more newline characters with double newline characters and delete \"Submitted for verification at \" comments.# Example Usage"
] |
77125f4649fbbe42c0aca0ee39009f821f83758c
|
# Dataset Card for "gametiles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
bratzzie/gametiles
|
[
"region:us"
] |
2023-09-05T09:46:37+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "images"}}}}], "splits": [{"name": "train", "num_bytes": 4153311.0, "num_examples": 34}], "download_size": 3706837, "dataset_size": 4153311.0}}
|
2023-09-05T11:12:48+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "gametiles"
More Information needed
|
[
"# Dataset Card for \"gametiles\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"gametiles\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"gametiles\"\n\nMore Information needed"
] |
576124483e4a46c71ae1296a753a3e6365cebe71
|
# Dataset Card for "BEBO_DS_UPDATED"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
gurprbebo/BEBO_DS_UPDATED
|
[
"region:us"
] |
2023-09-05T10:10:53+00:00
|
{"dataset_info": {"features": [{"name": "text", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2755, "num_examples": 9}], "download_size": 2776, "dataset_size": 2755}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]}
|
2023-09-05T10:16:01+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "BEBO_DS_UPDATED"
More Information needed
|
[
"# Dataset Card for \"BEBO_DS_UPDATED\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"BEBO_DS_UPDATED\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"BEBO_DS_UPDATED\"\n\nMore Information needed"
] |
86535a770b4de5a721b4242f4a127e26b208c456
|
# Dataset Card for "python-150_interduplication"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
antolin/python-150_interduplication
|
[
"region:us"
] |
2023-09-05T10:21:06+00:00
|
{"dataset_info": {"features": [{"name": "id_within_dataset", "dtype": "int64"}, {"name": "snippet", "dtype": "string"}, {"name": "tokens", "sequence": "string"}, {"name": "nl", "dtype": "string"}, {"name": "split_within_dataset", "dtype": "string"}, {"name": "is_duplicated", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 41621821.20269596, "num_examples": 40842}, {"name": "test", "num_bytes": 13915723.238891663, "num_examples": 13655}, {"name": "valid", "num_bytes": 13864768.55841238, "num_examples": 13605}], "download_size": 30588162, "dataset_size": 69402313.0}}
|
2023-11-10T11:57:59+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "python-150_interduplication"
More Information needed
|
[
"# Dataset Card for \"python-150_interduplication\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"python-150_interduplication\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"python-150_interduplication\"\n\nMore Information needed"
] |
052f01de644dba841176e0449528b41f27d94a61
|
# Bias in Bios
Bias in Bios was created by (De-Arteaga et al., 2019) and published under the MIT license (https://github.com/microsoft/biosbias). The dataset is used to investigate bias in NLP models. It consists of textual biographies used to predict professional occupation; the sensitive attribute is gender (binary).
The version shared here is the one proposed by (Ravfogel et al., 2020), which is slightly smaller due to the unavailability of 5,557 biographies.
The dataset is divided into train (257,000 samples), test (99,000 samples), and dev (40,000 samples) sets.
To load all splits ('train', 'dev', 'test'), use the following code:
```python
from datasets import load_dataset

train_dataset = load_dataset("LabHC/bias_in_bios", split='train')
test_dataset = load_dataset("LabHC/bias_in_bios", split='test')
dev_dataset = load_dataset("LabHC/bias_in_bios", split='dev')
```
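Each record exposes a 'hard_text' field (the biography), a 'profession' field (the class label), and a 'gender' field (the sensitive attribute), so a quick sanity check might look like this (an illustrative sketch):
```python
sample = train_dataset[0]
print(sample["hard_text"])   # biography text
print(sample["profession"])  # numerical occupation label (0-27)
print(sample["gender"])      # 0 = male, 1 = female
```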
Below are the classification and sensitive-attribute labels and their proportions. Distributions are similar across the three sets.
#### Classification labels
| Profession | Numerical label | Proportion (%) | Profession | Numerical label | Proportion (%) |
|---|---|---|---|---|---|
| accountant | 0 | 1.42 | painter | 14 | 1.95 |
| architect | 1 | 2.55 | paralegal | 15 | 0.45 |
| attorney | 2 | 8.22 | pastor | 16 | 0.64 |
| chiropractor | 3 | 0.67 | personal_trainer | 17 | 0.36 |
| comedian | 4 | 0.71 | photographer | 18 | 6.13 |
| composer | 5 | 1.41 | physician | 19 | 10.35 |
| dentist | 6 | 3.68 | poet | 20 | 1.77 |
| dietitian | 7 | 1.0 | professor | 21 | 29.8 |
| dj | 8 | 0.38 | psychologist | 22 | 4.64 |
| filmmaker | 9 | 1.77 | rapper | 23 | 0.35 |
| interior_designer | 10 | 0.37 | software_engineer | 24 | 1.74 |
| journalist | 11 | 5.03 | surgeon | 25 | 3.43 |
| model | 12 | 1.89 | teacher | 26 | 4.09 |
| nurse | 13 | 4.78 | yoga_teacher | 27 | 0.42 |
#### Sensitive attributes
| Gender | Numerical label | Proportion (%) |
|---|---|---|
| Male | 0 | 53.9 |
| Female | 1 | 46.1 |
---
(De-Arteaga et al., 2019) Maria De-Arteaga, Alexey Romanov, Hanna Wallach, Jennifer Chayes, Christian Borgs, Alexandra Chouldechova, Sahin Geyik, Krishnaram Kenthapadi, and Adam Tauman Kalai. 2019. Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT* '19). Association for Computing Machinery, New York, NY, USA, 120–128. https://doi.org/10.1145/3287560.3287572
(Ravfogel et al., 2020) Shauli Ravfogel, Yanai Elazar, Hila Gonen, Michael Twiton, and Yoav Goldberg. 2020. Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7237–7256, Online. Association for Computational Linguistics.
|
LabHC/bias_in_bios
|
[
"task_categories:text-classification",
"language:en",
"license:mit",
"region:us"
] |
2023-09-05T10:22:24+00:00
|
{"language": ["en"], "license": "mit", "task_categories": ["text-classification"], "dataset_info": {"features": [{"name": "hard_text", "dtype": "string"}, {"name": "profession", "dtype": "int64"}, {"name": "gender", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 107487885, "num_examples": 257478}, {"name": "test", "num_bytes": 41312256, "num_examples": 99069}, {"name": "dev", "num_bytes": 16504417, "num_examples": 39642}], "download_size": 99808338, "dataset_size": 165304558}}
|
2023-09-10T14:41:38+00:00
|
[] |
[
"en"
] |
TAGS
#task_categories-text-classification #language-English #license-mit #region-us
|
Bias in Bios
============
Bias in Bios was created by (De-Arteaga et al., 2019) and published under the MIT license (URL). The dataset is used to investigate bias in NLP models. It consists of textual biographies used to predict professional occupation; the sensitive attribute is gender (binary).
The version shared here is the one proposed by (Ravfogel et al., 2020), which is slightly smaller due to the unavailability of 5,557 biographies.
The dataset is divided into train (257,000 samples), test (99,000 samples), and dev (40,000 samples) sets.
To load all splits ('train', 'dev', 'test'), use the following code:
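```python
from datasets import load_dataset

train_dataset = load_dataset("LabHC/bias_in_bios", split='train')
test_dataset = load_dataset("LabHC/bias_in_bios", split='test')
dev_dataset = load_dataset("LabHC/bias_in_bios", split='dev')
```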
Below are the classification and sensitive-attribute labels and their proportions. Distributions are similar across the three sets.
#### Classification labels
#### Sensitive attributes
Gender: Male, Numerical label: 0, Proportion (%): 53.9
Gender: Female, Numerical label: 1, Proportion (%): 46.1
---
(De-Arteaga et al., 2019) Maria De-Arteaga, Alexey Romanov, Hanna Wallach, Jennifer Chayes, Christian Borgs, Alexandra Chouldechova, Sahin Geyik, Krishnaram Kenthapadi, and Adam Tauman Kalai. 2019. Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT\* '19). Association for Computing Machinery, New York, NY, USA, 120–128. URL
(Ravfogel et al., 2020) Shauli Ravfogel, Yanai Elazar, Hila Gonen, Michael Twiton, and Yoav Goldberg. 2020. Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7237–7256, Online. Association for Computational Linguistics.
|
[
"#### Classification labels",
"#### Sensitive attributes\n\n\nGender: Male, Numerical label: 0, Proportion (%): 53.9\nGender: Female, Numerical label: 1, Proportion (%): 46.1\n\n\n\n\n---\n\n\n(De-Artega et al., 2019) Maria De-Arteaga, Alexey Romanov, Hanna Wallach, Jennifer Chayes, Christian Borgs, Alexandra Chouldechova, Sahin Geyik, Krishnaram Kenthapadi, and Adam Tauman Kalai. 2019. Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT\\* '19). Association for Computing Machinery, New York, NY, USA, 120–128. URL\n\n\n(Ravgofel et al., 2020) Shauli Ravfogel, Yanai Elazar, Hila Gonen, Michael Twiton, and Yoav Goldberg. 2020. Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7237–7256, Online. Association for Computational Linguistics."
] |
[
"TAGS\n#task_categories-text-classification #language-English #license-mit #region-us \n",
"#### Classification labels",
"#### Sensitive attributes\n\n\nGender: Male, Numerical label: 0, Proportion (%): 53.9\nGender: Female, Numerical label: 1, Proportion (%): 46.1\n\n\n\n\n---\n\n\n(De-Artega et al., 2019) Maria De-Arteaga, Alexey Romanov, Hanna Wallach, Jennifer Chayes, Christian Borgs, Alexandra Chouldechova, Sahin Geyik, Krishnaram Kenthapadi, and Adam Tauman Kalai. 2019. Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT\\* '19). Association for Computing Machinery, New York, NY, USA, 120–128. URL\n\n\n(Ravgofel et al., 2020) Shauli Ravfogel, Yanai Elazar, Hila Gonen, Michael Twiton, and Yoav Goldberg. 2020. Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7237–7256, Online. Association for Computational Linguistics."
] |
[
26,
6,
281
] |
[
"passage: TAGS\n#task_categories-text-classification #language-English #license-mit #region-us \n#### Classification labels#### Sensitive attributes\n\n\nGender: Male, Numerical label: 0, Proportion (%): 53.9\nGender: Female, Numerical label: 1, Proportion (%): 46.1\n\n\n\n\n---\n\n\n(De-Artega et al., 2019) Maria De-Arteaga, Alexey Romanov, Hanna Wallach, Jennifer Chayes, Christian Borgs, Alexandra Chouldechova, Sahin Geyik, Krishnaram Kenthapadi, and Adam Tauman Kalai. 2019. Bias in Bios: A Case Study of Semantic Representation Bias in a High-Stakes Setting. In Proceedings of the Conference on Fairness, Accountability, and Transparency (FAT\\* '19). Association for Computing Machinery, New York, NY, USA, 120–128. URL\n\n\n(Ravgofel et al., 2020) Shauli Ravfogel, Yanai Elazar, Hila Gonen, Michael Twiton, and Yoav Goldberg. 2020. Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 7237–7256, Online. Association for Computational Linguistics."
] |
b473a57574baea5e9f9d3a0c0abc57a4fbb27154
|
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
Nyamerdene/datasets
|
[
"region:us"
] |
2023-09-05T10:35:32+00:00
|
{}
|
2023-09-05T11:09:36+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Dataset Name
## Dataset Description
- Homepage:
- Repository:
- Paper:
- Leaderboard:
- Point of Contact:
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using this raw template.
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Dataset Name",
"## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:",
"### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
8,
24,
32,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Dataset Name## Dataset Description\n\n- Homepage: \n- Repository: \n- Paper: \n- Leaderboard: \n- Point of Contact:### Dataset Summary\n\nThis dataset card aims to be a base template for new datasets. It has been generated using this raw template.### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
1c718e99316c7cb5824752a2c89a87ccae076b1f
|
# Dataset Card for Evaluation run of Danielbrdz/CodeBarcenas-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Danielbrdz/CodeBarcenas-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [Danielbrdz/CodeBarcenas-7b](https://huggingface.co/Danielbrdz/CodeBarcenas-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b",
"harness_winogrande_5",
split="train")
```
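If you would rather enumerate the 64 configurations than hard-code one, a minimal sketch using the `get_dataset_config_names` helper from the same `datasets` library (the printed slice is just for illustration):

```python
from datasets import get_dataset_config_names

# List every evaluation configuration stored in this repository
# (one per task, plus the aggregated "results" configuration).
configs = get_dataset_config_names("open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b")
print(len(configs), configs[:5])
```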
## Latest results
These are the [latest results from run 2023-09-18T02:45:17.599730](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b/blob/main/results_2023-09-18T02-45-17.599730.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119179,
"f1": 0.04712458053691294,
"f1_stderr": 0.0011987531964379016,
"acc": 0.31440371523474825,
"acc_stderr": 0.009024224601859619
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119179,
"f1": 0.04712458053691294,
"f1_stderr": 0.0011987531964379016
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.004302045046564279
},
"harness|winogrande|5": {
"acc": 0.6037884767166535,
"acc_stderr": 0.01374640415715496
}
}
```
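The aggregated figures above can also be pulled programmatically from the "results" configuration; a minimal sketch, assuming the "latest" split name listed in this card's configuration metadata:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; the
# "latest" split always points to the most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b",
                       "results",
                       split="latest")
print(results[0])
```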
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b
|
[
"region:us"
] |
2023-09-05T11:22:22+00:00
|
{"pretty_name": "Evaluation run of Danielbrdz/CodeBarcenas-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [Danielbrdz/CodeBarcenas-7b](https://huggingface.co/Danielbrdz/CodeBarcenas-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-09-18T02:45:17.599730](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b/blob/main/results_2023-09-18T02-45-17.599730.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119179,\n \"f1\": 0.04712458053691294,\n \"f1_stderr\": 0.0011987531964379016,\n \"acc\": 0.31440371523474825,\n \"acc_stderr\": 0.009024224601859619\n },\n \"harness|drop|3\": {\n \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119179,\n \"f1\": 0.04712458053691294,\n \"f1_stderr\": 0.0011987531964379016\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \"acc_stderr\": 0.004302045046564279\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6037884767166535,\n \"acc_stderr\": 0.01374640415715496\n }\n}\n```", "repo_url": "https://huggingface.co/Danielbrdz/CodeBarcenas-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|arc:challenge|25_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_09_18T02_45_17.599730", "path": ["**/details_harness|drop|3_2023-09-18T02-45-17.599730.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-09-18T02-45-17.599730.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_09_18T02_45_17.599730", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-45-17.599730.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-09-18T02-45-17.599730.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hellaswag|10_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", 
"data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T12:21:59.082242.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T12:21:59.082242.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_09_18T02_45_17.599730", "path": ["**/details_harness|winogrande|5_2023-09-18T02-45-17.599730.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-09-18T02-45-17.599730.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T12_21_59.082242", "path": ["results_2023-09-05T12:21:59.082242.parquet"]}, {"split": "2023_09_18T02_45_17.599730", "path": ["results_2023-09-18T02-45-17.599730.parquet"]}, {"split": "latest", "path": ["results_2023-09-18T02-45-17.599730.parquet"]}]}]}
|
2023-09-18T01:45:29+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of Danielbrdz/CodeBarcenas-7b
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model Danielbrdz/CodeBarcenas-7b on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
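A minimal sketch of that load call, mirroring the example given in the full card above (the "harness_winogrande_5" config name is one of the per-task configurations):

```python
from datasets import load_dataset

# Load the winogrande details; the "train" split points to the latest run.
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b",
                    "harness_winogrande_5",
                    split="train")
```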
## Latest results
These are the latest results from run 2023-09-18T02:45:17.599730 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of Danielbrdz/CodeBarcenas-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Danielbrdz/CodeBarcenas-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T02:45:17.599730(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Danielbrdz/CodeBarcenas-7b",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model Danielbrdz/CodeBarcenas-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-09-18T02:45:17.599730(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
19,
31,
167,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Danielbrdz/CodeBarcenas-7b## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model Danielbrdz/CodeBarcenas-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-09-18T02:45:17.599730(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
71821217ad0037a15ca011308119959b25248964
|
# Dataset Card for "838a27e4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/838a27e4
|
[
"region:us"
] |
2023-09-05T11:23:24+00:00
|
{"dataset_info": {"features": [{"name": "result", "dtype": "string"}, {"name": "id", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 180, "num_examples": 10}], "download_size": 1341, "dataset_size": 180}}
|
2023-09-05T11:23:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "838a27e4"
More Information needed
|
[
"# Dataset Card for \"838a27e4\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"838a27e4\"\n\nMore Information needed"
] |
[
6,
15
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"838a27e4\"\n\nMore Information needed"
] |
f70dbff0df113ea1a9fab37cf84b3bb03eb44601
|
# Dataset Card for "balanced_structs_reduced_labelled_large_enc_key_name_addr_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
johannes-garstenauer/balanced_structs_reduced_labelled_large_enc_key_name_addr_split
|
[
"region:us"
] |
2023-09-05T11:41:44+00:00
|
{"dataset_info": {"features": [{"name": "struct", "dtype": "string"}, {"name": "label", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 43611213.0, "num_examples": 265791}, {"name": "test", "num_bytes": 2295327.0, "num_examples": 13989}], "download_size": 9152382, "dataset_size": 45906540.0}}
|
2023-09-05T11:41:57+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "balanced_structs_reduced_labelled_large_enc_key_name_addr_split"
More Information needed
|
[
"# Dataset Card for \"balanced_structs_reduced_labelled_large_enc_key_name_addr_split\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"balanced_structs_reduced_labelled_large_enc_key_name_addr_split\"\n\nMore Information needed"
] |
[
6,
37
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"balanced_structs_reduced_labelled_large_enc_key_name_addr_split\"\n\nMore Information needed"
] |
28e8544efaa82d810a9d556556dcf71c4525cd2e
|
# Dataset Card for "st-parallel-sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
Bingsu/st-parallel-sentences
|
[
"region:us"
] |
2023-09-05T11:54:57+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "en", "dtype": "string"}, {"name": "other", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 35774892810, "num_examples": 257055413}], "download_size": 22222052417, "dataset_size": 35774892810}}
|
2023-09-05T12:29:47+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "st-parallel-sentences"
More Information needed
|
[
"# Dataset Card for \"st-parallel-sentences\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"st-parallel-sentences\"\n\nMore Information needed"
] |
[
6,
18
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"st-parallel-sentences\"\n\nMore Information needed"
] |
8cdfe0ceee83c824ea5de6f5af75b2c6f153de33
|
# Dataset Card for "text_summary_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
mHossain/text_summary_v2
|
[
"region:us"
] |
2023-09-05T12:03:21+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "Unnamed: 0", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 2591192.7, "num_examples": 540}, {"name": "test", "num_bytes": 287910.3, "num_examples": 60}], "download_size": 1754170, "dataset_size": 2879103.0}}
|
2023-09-05T12:03:23+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "text_summary_v2"
More Information needed
|
[
"# Dataset Card for \"text_summary_v2\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"text_summary_v2\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"text_summary_v2\"\n\nMore Information needed"
] |
3c151b390329e01a0b261a434c4997be9d047b97
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-2.1](https://huggingface.co/jondurbin/airoboros-33b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1",
"harness_winogrande_5",
split="train")
```
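Once loaded, the split behaves like any other `datasets.Dataset`; a minimal sketch of inspecting it as a pandas DataFrame (assumes pandas is installed):

```python
from datasets import load_dataset

# Load the winogrande details and inspect them as a pandas DataFrame.
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1",
                    "harness_winogrande_5",
                    split="train")
df = data.to_pandas()  # requires pandas
print(df.shape, list(df.columns)[:5])
```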
## Latest results
These are the [latest results from run 2023-10-22T06:51:11.330881](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1/blob/main/results_2023-10-22T06-51-11.330881.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3497273489932886,
"em_stderr": 0.004883736162511905,
"f1": 0.4412342701342311,
"f1_stderr": 0.004680103876700143,
"acc": 0.424060824343141,
"acc_stderr": 0.009219008635986778
},
"harness|drop|3": {
"em": 0.3497273489932886,
"em_stderr": 0.004883736162511905,
"f1": 0.4412342701342311,
"f1_stderr": 0.004680103876700143
},
"harness|gsm8k|5": {
"acc": 0.06595905989385899,
"acc_stderr": 0.0068369511920342305
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
}
}
```
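A specific historical run can also be selected by its timestamped split name instead of "latest"; a sketch, assuming the "results" configuration follows the same split-naming pattern shown for the per-task configs in this card's metadata:

```python
from datasets import load_dataset

# Split names encode the run timestamp (see the configuration list in
# this card's metadata); "latest" is an alias for the newest of them.
run = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1",
                   "results",
                   split="2023_10_22T06_51_11.330881")
```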
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1
|
[
"region:us"
] |
2023-09-05T12:06:16+00:00
|
{"pretty_name": "Evaluation run of jondurbin/airoboros-33b-2.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-2.1](https://huggingface.co/jondurbin/airoboros-33b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-22T06:51:11.330881](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1/blob/main/results_2023-10-22T06-51-11.330881.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3497273489932886,\n \"em_stderr\": 0.004883736162511905,\n \"f1\": 0.4412342701342311,\n \"f1_stderr\": 0.004680103876700143,\n \"acc\": 0.424060824343141,\n \"acc_stderr\": 0.009219008635986778\n },\n \"harness|drop|3\": {\n \"em\": 0.3497273489932886,\n \"em_stderr\": 0.004883736162511905,\n \"f1\": 0.4412342701342311,\n \"f1_stderr\": 0.004680103876700143\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06595905989385899,\n \"acc_stderr\": 0.0068369511920342305\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n }\n}\n```", "repo_url": "https://huggingface.co/jondurbin/airoboros-33b-2.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|arc:challenge|25_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_22T06_51_11.330881", "path": ["**/details_harness|drop|3_2023-10-22T06-51-11.330881.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-22T06-51-11.330881.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_22T06_51_11.330881", "path": ["**/details_harness|gsm8k|5_2023-10-22T06-51-11.330881.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-22T06-51-11.330881.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hellaswag|10_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": 
[{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet", 
"**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet", 
"**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", 
"data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", 
"path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": 
"2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T13:05:52.227014.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T13:05:52.227014.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_22T06_51_11.330881", "path": ["**/details_harness|winogrande|5_2023-10-22T06-51-11.330881.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-22T06-51-11.330881.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T13_05_52.227014", "path": ["results_2023-09-05T13:05:52.227014.parquet"]}, {"split": "2023_10_22T06_51_11.330881", "path": ["results_2023-10-22T06-51-11.330881.parquet"]}, {"split": "latest", "path": ["results_2023-10-22T06-51-11.330881.parquet"]}]}]}
|
2023-10-22T05:51:24+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-2.1
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model jondurbin/airoboros-33b-2.1 on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
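```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1",
	"harness_winogrande_5",
	split="train")
```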
## Latest results
These are the latest results from run 2023-10-22T06:51:11.330881 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
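```python
{
    "all": {
        "em": 0.3497273489932886,
        "em_stderr": 0.004883736162511905,
        "f1": 0.4412342701342311,
        "f1_stderr": 0.004680103876700143,
        "acc": 0.424060824343141,
        "acc_stderr": 0.009219008635986778
    },
    "harness|drop|3": {
        "em": 0.3497273489932886,
        "em_stderr": 0.004883736162511905,
        "f1": 0.4412342701342311,
        "f1_stderr": 0.004680103876700143
    },
    "harness|gsm8k|5": {
        "acc": 0.06595905989385899,
        "acc_stderr": 0.0068369511920342305
    },
    "harness|winogrande|5": {
        "acc": 0.7821625887924231,
        "acc_stderr": 0.011601066079939324
    }
}
```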
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-2.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T06:51:11.330881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of jondurbin/airoboros-33b-2.1",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-22T06:51:11.330881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
20,
31,
168,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of jondurbin/airoboros-33b-2.1## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model jondurbin/airoboros-33b-2.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-22T06:51:11.330881(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
e97dffcc25199827509515968ecd024ba96b2e82
|
An RL environment called Ships for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_Ships
```
|
edbeeching/godot_rl_Ships
|
[
"deep-reinforcement-learning",
"reinforcement-learning",
"godot-rl",
"environments",
"video-games",
"region:us"
] |
2023-09-05T12:09:08+00:00
|
{"library_name": "godot-rl", "tags": ["deep-reinforcement-learning", "reinforcement-learning", "godot-rl", "environments", "video-games"]}
|
2024-01-07T12:20:07+00:00
|
[] |
[] |
TAGS
#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us
|
An RL environment called Ships for the Godot Game Engine.
This environment was created with: URL
## Downloading the environment
After installing Godot RL Agents, download the environment with:
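```
gdrl.env_from_hub -r edbeeching/godot_rl_Ships
```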
|
[
"## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] |
[
"TAGS\n#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us \n",
"## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] |
[
32,
20
] |
[
"passage: TAGS\n#deep-reinforcement-learning #reinforcement-learning #godot-rl #environments #video-games #region-us \n## Downloading the environment \n\nAfter installing Godot RL Agents, download the environment with:"
] |
57c8bebe463b3d3fca1da2832377c67eefcb45f3
|
# Dataset Card for "text_summary_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
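A minimal loading sketch (split names, row counts, and column names are taken from this repository's dataset configuration; nothing else is assumed):

```python
from datasets import load_dataset

# Default config per the dataset configuration: a "train" split (45 rows) and
# a "test" split (5 rows), each with "Unnamed: 0", "text", and "summary" columns.
ds = load_dataset("mHossain/text_summary_v3")
print(ds["train"][0]["summary"])
```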
|
mHossain/text_summary_v3
|
[
"region:us"
] |
2023-09-05T12:12:00+00:00
|
{"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "Unnamed: 0", "dtype": "int64"}, {"name": "text", "dtype": "string"}, {"name": "summary", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 234990.0, "num_examples": 45}, {"name": "test", "num_bytes": 26110.0, "num_examples": 5}], "download_size": 183680, "dataset_size": 261100.0}}
|
2023-09-05T12:12:02+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "text_summary_v3"
More Information needed
|
[
"# Dataset Card for \"text_summary_v3\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"text_summary_v3\"\n\nMore Information needed"
] |
[
6,
17
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"text_summary_v3\"\n\nMore Information needed"
] |
29beebd6bd15caa16d05169afdaaeb344e0185fe
|
# Dataset Card for Evaluation run of TFLai/Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** [email protected]
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Nova-13B](https://huggingface.co/TFLai/Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Nova-13B",
"harness_winogrande_5",
split="train")
```
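The aggregated scores are exposed through the "results" configuration described above; a minimal sketch for loading them (the "latest" split name follows the convention used by this dataset's other configurations):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; its "latest" split points at the
# most recent evaluation run.
results = load_dataset("open-llm-leaderboard/details_TFLai__Nova-13B",
	"results",
	split="latest")
```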
## Latest results
These are the [latest results from run 2023-10-19T17:54:42.292513](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nova-13B/blob/main/results_2023-10-19T17-54-42.292513.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.00975251677852349,
"em_stderr": 0.0010063982618519808,
"f1": 0.0883609479865776,
"f1_stderr": 0.0018862516568409098,
"acc": 0.42008337856104666,
"acc_stderr": 0.009344043651724289
},
"harness|drop|3": {
"em": 0.00975251677852349,
"em_stderr": 0.0010063982618519808,
"f1": 0.0883609479865776,
"f1_stderr": 0.0018862516568409098
},
"harness|gsm8k|5": {
"acc": 0.06747536012130402,
"acc_stderr": 0.006909475136357493
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091087
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
|
open-llm-leaderboard/details_TFLai__Nova-13B
|
[
"region:us"
] |
2023-09-05T12:22:04+00:00
|
{"pretty_name": "Evaluation run of TFLai/Nova-13B", "dataset_summary": "Dataset automatically created during the evaluation run of model [TFLai/Nova-13B](https://huggingface.co/TFLai/Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Nova-13B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-10-19T17:54:42.292513](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nova-13B/blob/main/results_2023-10-19T17-54-42.292513.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.00975251677852349,\n \"em_stderr\": 0.0010063982618519808,\n \"f1\": 0.0883609479865776,\n \"f1_stderr\": 0.0018862516568409098,\n \"acc\": 0.42008337856104666,\n \"acc_stderr\": 0.009344043651724289\n },\n \"harness|drop|3\": {\n \"em\": 0.00975251677852349,\n \"em_stderr\": 0.0010063982618519808,\n \"f1\": 0.0883609479865776,\n \"f1_stderr\": 0.0018862516568409098\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06747536012130402,\n \"acc_stderr\": 0.006909475136357493\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091087\n }\n}\n```", "repo_url": "https://huggingface.co/TFLai/Nova-13B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|arc:challenge|25_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_drop_3", "data_files": [{"split": "2023_10_19T17_54_42.292513", "path": ["**/details_harness|drop|3_2023-10-19T17-54-42.292513.parquet"]}, {"split": "latest", "path": ["**/details_harness|drop|3_2023-10-19T17-54-42.292513.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_10_19T17_54_42.292513", "path": ["**/details_harness|gsm8k|5_2023-10-19T17-54-42.292513.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-10-19T17-54-42.292513.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hellaswag|10_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet", 
"**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet", 
"**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": 
"2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T13:21:41.017236.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-09-05T13:21:41.017236.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_10_19T17_54_42.292513", "path": ["**/details_harness|winogrande|5_2023-10-19T17-54-42.292513.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-10-19T17-54-42.292513.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_09_05T13_21_41.017236", "path": ["results_2023-09-05T13:21:41.017236.parquet"]}, {"split": "2023_10_19T17_54_42.292513", "path": ["results_2023-10-19T17-54-42.292513.parquet"]}, {"split": "latest", "path": ["results_2023-10-19T17-54-42.292513.parquet"]}]}]}
|
2023-10-19T16:54:54+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for Evaluation run of TFLai/Nova-13B
## Dataset Description
- Homepage:
- Repository: URL
- Paper:
- Leaderboard: URL
- Point of Contact: clementine@URL
### Dataset Summary
Dataset automatically created during the evaluation run of model TFLai/Nova-13B on the Open LLM Leaderboard.
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
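For example (a minimal sketch — the repo id below is inferred from the leaderboard's `details_<org>__<model>` naming convention, and `harness_winogrande_5` is one of the configurations declared in the metadata above):

```python
from datasets import load_dataset

# Repo id assumed from the "details_<org>__<model>" naming convention;
# "harness_winogrande_5" is a configuration listed in this record's metadata.
data = load_dataset("open-llm-leaderboard/details_TFLai__Nova-13B",
                    "harness_winogrande_5",
                    split="train")
```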
## Latest results
These are the latest results from run 2023-10-19T17:54:42.292513 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
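To inspect these aggregated numbers directly, the "results" configuration can be loaded the same way (again a sketch under the same repo-id assumption; the "latest" split is declared for "results" in the metadata above):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; its "latest" split points to
# the most recent run (2023-10-19T17:54:42.292513 for this record).
results = load_dataset("open-llm-leaderboard/details_TFLai__Nova-13B",
                       "results",
                       split="latest")
print(results[0])
```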
### Supported Tasks and Leaderboards
### Languages
## Dataset Structure
### Data Instances
### Data Fields
### Data Splits
## Dataset Creation
### Curation Rationale
### Source Data
#### Initial Data Collection and Normalization
#### Who are the source language producers?
### Annotations
#### Annotation process
#### Who are the annotators?
### Personal and Sensitive Information
## Considerations for Using the Data
### Social Impact of Dataset
### Discussion of Biases
### Other Known Limitations
## Additional Information
### Dataset Curators
### Licensing Information
### Contributions
|
[
"# Dataset Card for Evaluation run of TFLai/Nova-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/Nova-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T17:54:42.292513(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of TFLai/Nova-13B",
"## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL",
"### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/Nova-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-10-19T17:54:42.292513(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"### Supported Tasks and Leaderboards",
"### Languages",
"## Dataset Structure",
"### Data Instances",
"### Data Fields",
"### Data Splits",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Initial Data Collection and Normalization",
"#### Who are the source language producers?",
"### Annotations",
"#### Annotation process",
"#### Who are the annotators?",
"### Personal and Sensitive Information",
"## Considerations for Using the Data",
"### Social Impact of Dataset",
"### Discussion of Biases",
"### Other Known Limitations",
"## Additional Information",
"### Dataset Curators",
"### Licensing Information",
"### Contributions"
] |
[
6,
17,
31,
165,
67,
10,
4,
6,
6,
5,
5,
5,
7,
4,
10,
10,
5,
5,
9,
8,
8,
7,
8,
7,
5,
6,
6,
5
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of TFLai/Nova-13B## Dataset Description\n\n- Homepage: \n- Repository: URL\n- Paper: \n- Leaderboard: URL\n- Point of Contact: clementine@URL### Dataset Summary\n\nDataset automatically created during the evaluation run of model TFLai/Nova-13B on the Open LLM Leaderboard.\n\nThe dataset is composed of 64 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the agregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-10-19T17:54:42.292513(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):### Supported Tasks and Leaderboards### Languages## Dataset Structure### Data Instances### Data Fields### Data Splits## Dataset Creation### Curation Rationale### Source Data#### Initial Data Collection and Normalization#### Who are the source language producers?### Annotations#### Annotation process#### Who are the annotators?### Personal and Sensitive Information## Considerations for Using the Data### Social Impact of Dataset### Discussion of Biases### Other Known Limitations## Additional Information### Dataset Curators### Licensing Information### Contributions"
] |
8e790ad093dec834eda8e5ccabdbf047a6ddb0f6
|
# Dataset Card for "ActorData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
dshut002/ActorData
|
[
"region:us"
] |
2023-09-05T12:46:57+00:00
|
{"dataset_info": {"features": [{"name": "input", "dtype": "string"}, {"name": "output", "dtype": "string"}, {"name": "instruction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 51939, "num_examples": 100}], "download_size": 28059, "dataset_size": 51939}}
|
2023-09-05T15:56:26+00:00
|
[] |
[] |
TAGS
#region-us
|
# Dataset Card for "ActorData"
More Information needed
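While the card itself is still a stub, the metadata above does describe the schema (string fields `input`, `output`, `instruction`; a single `train` split of 100 examples), so a hedged loading sketch would be:

```python
from datasets import load_dataset

# Hub id taken from this record; field names per the dataset metadata above.
ds = load_dataset("dshut002/ActorData", split="train")
print(ds[0]["instruction"], ds[0]["input"], ds[0]["output"])
```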
|
[
"# Dataset Card for \"ActorData\"\n\nMore Information needed"
] |
[
"TAGS\n#region-us \n",
"# Dataset Card for \"ActorData\"\n\nMore Information needed"
] |
[
6,
13
] |
[
"passage: TAGS\n#region-us \n# Dataset Card for \"ActorData\"\n\nMore Information needed"
] |