sha (string, 40) | text (string, 1 to 13.4M) | id (string, 2 to 117) | tags (list, 1 to 7.91k) | created_at (string, 25) | metadata (string, 2 to 875k) | last_modified (string, 25) | arxiv (list, 0 to 25) | languages (list, 0 to 7.91k) | tags_str (string, 17 to 159k) | text_str (string, 1 to 447k) | text_lists (list, 0 to 352) | processed_texts (list, 1 to 353) | tokens_length (list, 1 to 353) | input_texts (list, 1 to 40) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
766322593df22bf271658fc7fc080c46ff1a66d7 | # Dataset Card for "dailydialogsample-synonym_adjective"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dailydialogsample-synonym_adjective | [
"region:us"
] | 2023-12-13T07:28:58+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 15998, "num_examples": 25}], "download_size": 18133, "dataset_size": 15998}} | 2023-12-13T07:29:00+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dailydialogsample-synonym_adjective"
More Information needed | [
"# Dataset Card for \"dailydialogsample-synonym_adjective\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dailydialogsample-synonym_adjective\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dailydialogsample-synonym_adjective\"\n\nMore Information needed"
] |
01056972a487cef01dc4f5c0e66323b3ce98b21e | # Dataset Card for "dailydialogsample-repeat_itself"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dailydialogsample-repeat_itself | [
"region:us"
] | 2023-12-13T07:29:01+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 45844, "num_examples": 100}], "download_size": 35782, "dataset_size": 45844}} | 2023-12-13T07:29:03+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dailydialogsample-repeat_itself"
More Information needed | [
"# Dataset Card for \"dailydialogsample-repeat_itself\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dailydialogsample-repeat_itself\"\n\nMore Information needed"
] | [
6,
22
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dailydialogsample-repeat_itself\"\n\nMore Information needed"
] |
d47619d00c316c162182ac0198334166cf8b7322 | # Dataset Card for "dailydialogsample-repeat_last_speaker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dailydialogsample-repeat_last_speaker | [
"region:us"
] | 2023-12-13T07:29:04+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 46577, "num_examples": 100}], "download_size": 36122, "dataset_size": 46577}} | 2023-12-13T07:29:07+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dailydialogsample-repeat_last_speaker"
More Information needed | [
"# Dataset Card for \"dailydialogsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dailydialogsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
6,
25
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dailydialogsample-repeat_last_speaker\"\n\nMore Information needed"
] |
67b573afbd3fe4fc2ebd2ee35e423468d123cade | # Dataset Card for "dailydialogsample-negate_previous_utterance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dailydialogsample-negate_previous_utterance | [
"region:us"
] | 2023-12-13T07:29:07+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 45417, "num_examples": 100}], "download_size": 35523, "dataset_size": 45417}} | 2023-12-13T07:29:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dailydialogsample-negate_previous_utterance"
More Information needed | [
"# Dataset Card for \"dailydialogsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dailydialogsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
6,
26
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dailydialogsample-negate_previous_utterance\"\n\nMore Information needed"
] |
a317c7d52bf2ee3b08ccff3d6a40b85bf29f6c73 | # Dataset Card for "dreamsample-expansions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dreamsample-expansions | [
"region:us"
] | 2023-12-13T07:29:10+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 29363, "num_examples": 53}], "download_size": 23841, "dataset_size": 29363}} | 2023-12-13T07:29:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dreamsample-expansions"
More Information needed | [
"# Dataset Card for \"dreamsample-expansions\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dreamsample-expansions\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dreamsample-expansions\"\n\nMore Information needed"
] |
aab5919b6603f8c57a9435b4c228905d20e19218 | # Dataset Card for "dreamsample-synonym_adjective"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dreamsample-synonym_adjective | [
"region:us"
] | 2023-12-13T07:29:13+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 17017, "num_examples": 29}], "download_size": 16463, "dataset_size": 17017}} | 2023-12-13T07:29:15+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dreamsample-synonym_adjective"
More Information needed | [
"# Dataset Card for \"dreamsample-synonym_adjective\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dreamsample-synonym_adjective\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dreamsample-synonym_adjective\"\n\nMore Information needed"
] |
8722b9c0e5a3442769c7bfb8d269566f834c487a | # Dataset Card for "dreamsample-repeat_itself"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dreamsample-repeat_itself | [
"region:us"
] | 2023-12-13T07:29:16+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 50690, "num_examples": 100}], "download_size": 38085, "dataset_size": 50690}} | 2023-12-13T07:29:18+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dreamsample-repeat_itself"
More Information needed | [
"# Dataset Card for \"dreamsample-repeat_itself\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dreamsample-repeat_itself\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dreamsample-repeat_itself\"\n\nMore Information needed"
] |
2234f24945848311fde89d50783cfec6d21dd3e5 | # Dataset Card for "dreamsample-repeat_last_speaker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dreamsample-repeat_last_speaker | [
"region:us"
] | 2023-12-13T07:29:19+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 49282, "num_examples": 100}], "download_size": 36559, "dataset_size": 49282}} | 2023-12-13T07:29:21+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dreamsample-repeat_last_speaker"
More Information needed | [
"# Dataset Card for \"dreamsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dreamsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dreamsample-repeat_last_speaker\"\n\nMore Information needed"
] |
27de33c8e95bc3554458a8cd212a73a825d01231 | # Dataset Card for "dreamsample-negate_previous_utterance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dreamsample-negate_previous_utterance | [
"region:us"
] | 2023-12-13T07:29:22+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 50660, "num_examples": 100}], "download_size": 37581, "dataset_size": 50660}} | 2023-12-13T07:29:24+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dreamsample-negate_previous_utterance"
More Information needed | [
"# Dataset Card for \"dreamsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dreamsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
6,
25
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dreamsample-negate_previous_utterance\"\n\nMore Information needed"
] |
aa9d1ce36e6d0df679c131cba1444647fb935315 | # Dataset Card for "mutualsample-expansions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/mutualsample-expansions | [
"region:us"
] | 2023-12-13T07:29:25+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 26216, "num_examples": 47}], "download_size": 23053, "dataset_size": 26216}} | 2023-12-13T07:29:27+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mutualsample-expansions"
More Information needed | [
"# Dataset Card for \"mutualsample-expansions\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mutualsample-expansions\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mutualsample-expansions\"\n\nMore Information needed"
] |
c66dfab6d0a872444688e04ce7c10cb9c2c82879 | # Dataset Card for "mutualsample-synonym_adjective"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/mutualsample-synonym_adjective | [
"region:us"
] | 2023-12-13T07:29:28+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13906, "num_examples": 22}], "download_size": 15306, "dataset_size": 13906}} | 2023-12-13T07:29:30+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mutualsample-synonym_adjective"
More Information needed | [
"# Dataset Card for \"mutualsample-synonym_adjective\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mutualsample-synonym_adjective\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mutualsample-synonym_adjective\"\n\nMore Information needed"
] |
c52e605eb026aec5ec9170dbbfb15045471cdbdc | # Dataset Card for "mutualsample-repeat_itself"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/mutualsample-repeat_itself | [
"region:us"
] | 2023-12-13T07:29:31+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 50858, "num_examples": 100}], "download_size": 38783, "dataset_size": 50858}} | 2023-12-13T07:29:33+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mutualsample-repeat_itself"
More Information needed | [
"# Dataset Card for \"mutualsample-repeat_itself\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mutualsample-repeat_itself\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mutualsample-repeat_itself\"\n\nMore Information needed"
] |
841948e0bc039a0a0efac7e5c50abe87ee898a6e | # Dataset Card for "mutualsample-repeat_last_speaker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/mutualsample-repeat_last_speaker | [
"region:us"
] | 2023-12-13T07:29:34+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 49856, "num_examples": 100}], "download_size": 38655, "dataset_size": 49856}} | 2023-12-13T07:29:36+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mutualsample-repeat_last_speaker"
More Information needed | [
"# Dataset Card for \"mutualsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mutualsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mutualsample-repeat_last_speaker\"\n\nMore Information needed"
] |
c1aedd6e97d57bbe5e672a26fe216561946904e5 | # Dataset Card for "mutualsample-negate_previous_utterance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/mutualsample-negate_previous_utterance | [
"region:us"
] | 2023-12-13T07:29:37+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 50606, "num_examples": 100}], "download_size": 38241, "dataset_size": 50606}} | 2023-12-13T07:29:39+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "mutualsample-negate_previous_utterance"
More Information needed | [
"# Dataset Card for \"mutualsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"mutualsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
6,
25
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"mutualsample-negate_previous_utterance\"\n\nMore Information needed"
] |
7a3363b58944541f6eeda0a64ed515d39cf25c89 | # Dataset Card for "personachatsample-expansions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/personachatsample-expansions | [
"region:us"
] | 2023-12-13T07:29:40+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7250, "num_examples": 20}], "download_size": 9048, "dataset_size": 7250}} | 2023-12-13T07:29:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "personachatsample-expansions"
More Information needed | [
"# Dataset Card for \"personachatsample-expansions\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"personachatsample-expansions\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"personachatsample-expansions\"\n\nMore Information needed"
] |
16f18d666d2fb7042477efd64bf0c2455e5ab7a6 | # Dataset Card for "personachatsample-synonym_adjective"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/personachatsample-synonym_adjective | [
"region:us"
] | 2023-12-13T07:29:43+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 12580, "num_examples": 32}], "download_size": 13006, "dataset_size": 12580}} | 2023-12-13T07:29:46+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "personachatsample-synonym_adjective"
More Information needed | [
"# Dataset Card for \"personachatsample-synonym_adjective\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"personachatsample-synonym_adjective\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"personachatsample-synonym_adjective\"\n\nMore Information needed"
] |
8b897df40283c60acb6251b0cd100a3754b73859 | # Dataset Card for "personachatsample-repeat_itself"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/personachatsample-repeat_itself | [
"region:us"
] | 2023-12-13T07:29:46+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 35611, "num_examples": 100}], "download_size": 27419, "dataset_size": 35611}} | 2023-12-13T07:29:49+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "personachatsample-repeat_itself"
More Information needed | [
"# Dataset Card for \"personachatsample-repeat_itself\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"personachatsample-repeat_itself\"\n\nMore Information needed"
] | [
6,
21
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"personachatsample-repeat_itself\"\n\nMore Information needed"
] |
b87a073932ae883b44b3e52d110396946dd5d621 | # Dataset Card for "personachatsample-repeat_last_speaker"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/personachatsample-repeat_last_speaker | [
"region:us"
] | 2023-12-13T07:29:49+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36057, "num_examples": 100}], "download_size": 27841, "dataset_size": 36057}} | 2023-12-13T07:29:52+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "personachatsample-repeat_last_speaker"
More Information needed | [
"# Dataset Card for \"personachatsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"personachatsample-repeat_last_speaker\"\n\nMore Information needed"
] | [
6,
24
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"personachatsample-repeat_last_speaker\"\n\nMore Information needed"
] |
48761dd7b5762a50cef6fb80a8a0469b5ae0ac7a | # Dataset Card for "personachatsample-negate_previous_utterance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/personachatsample-negate_previous_utterance | [
"region:us"
] | 2023-12-13T07:29:52+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 35605, "num_examples": 100}], "download_size": 27177, "dataset_size": 35605}} | 2023-12-13T07:29:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "personachatsample-negate_previous_utterance"
More Information needed | [
"# Dataset Card for \"personachatsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"personachatsample-negate_previous_utterance\"\n\nMore Information needed"
] | [
6,
25
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"personachatsample-negate_previous_utterance\"\n\nMore Information needed"
] |
9f619cdca8679775a918846332c42561866045f1 | # Stained Glass Art Dataset for Diffusion Models
## Overview
This dataset consists of 21 high-resolution images of stained glass art, accompanied by corresponding captions. It is designed for fine-tuning diffusion models using techniques such as textual inversion and dreambooth. The dataset is intended to facilitate research and experimentation in generating stained glass art-inspired images.
## Dataset Structure
- **Images:** The stained glass art images are stored in the "images" directory, with filenames ranging from "0.jpg" to "20.jpg."
- **Captions:** Captions for each image are provided in the "captions.csv" file located in the dataset's root directory. The captions contain placeholders for adjectives and a custom token to represent stained glass art. For example: "A {adjective} {token} of a puppy."
- **Adjective Placeholders:** During training, the {adjective} placeholder in the captions can be randomly selected from the following list: `["", "good", "cropped", "clean", "bright", "cool", "nice", "small", "large", "dark", "weird"]`.
- **Token Placeholder:** The {token} placeholder represents the custom token that needs to be trained to capture the unique art style of stained glass. This token is a key element in generating realistic stained glass art-inspired images.
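For illustration, a training caption could be assembled roughly as follows (a minimal sketch: the `<stained-glass>` token string and the `caption` column name are assumptions; only the adjective list and the `{adjective}`/`{token}` placeholder scheme come from this card):
```python
import csv
import random

# Adjective pool listed in the card; "" yields a caption with no adjective.
ADJECTIVES = ["", "good", "cropped", "clean", "bright", "cool", "nice",
              "small", "large", "dark", "weird"]

def build_caption(template: str, token: str = "<stained-glass>") -> str:
    # Fill the {adjective} slot with a random choice and the {token} slot with
    # the custom token being trained; collapse doubled spaces when adjective is "".
    filled = template.format(adjective=random.choice(ADJECTIVES), token=token)
    return " ".join(filled.split())

with open("captions.csv", newline="") as f:
    for row in csv.DictReader(f):
        print(build_caption(row["caption"]))  # "caption" column name is an assumption
```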
| abinthomasonline/stained-glass | [
"task_categories:text-to-image",
"task_categories:image-to-image",
"task_categories:unconditional-image-generation",
"size_categories:n<1K",
"license:mit",
"art",
"region:us"
] | 2023-12-13T07:50:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image", "image-to-image", "unconditional-image-generation"], "pretty_name": "stained", "tags": ["art"]} | 2023-12-13T19:39:15+00:00 | [] | [] | TAGS
#task_categories-text-to-image #task_categories-image-to-image #task_categories-unconditional-image-generation #size_categories-n<1K #license-mit #art #region-us
| # Stained Glass Art Dataset for Diffusion Models
## Overview
This dataset consists of 21 high-resolution images of stained glass art, accompanied by corresponding captions. It is designed for fine-tuning diffusion models using techniques such as textual inversion and dreambooth. The dataset is intended to facilitate research and experimentation in generating stained glass art-inspired images.
## Dataset Structure
- Images: The stained glass art images are stored in the "images" directory, with filenames ranging from "0.jpg" to "URL."
- Captions: Captions for each image are provided in the "URL" file located in the dataset's root directory. The captions contain placeholders for adjectives and a custom token to represent stained glass art. For example: "A {adjective} {token} of a puppy."
- Adjective Placeholders: During training, the {adjective} placeholder in the captions can be randomly selected from the following list: '["", "good", "cropped", "clean", "bright", "cool", "nice", "small", "large", "dark", "weird"]'.
- Token Placeholder: The {token} placeholder represents the custom token that needs to be trained to capture the unique art style of stained glass. This token is a key element in generating realistic stained glass art-inspired images.
| [
"# Stained Glass Art Dataset for Diffusion Models",
"## Overview\nThis dataset consists of 21 high-resolution images of stained glass art, accompanied by corresponding captions. It is designed for fine-tuning diffusion models using techniques such as textual inversion and dreambooth. The dataset is intended to facilitate research and experimentation in generating stained glass art-inspired images.",
"## Dataset Structure\n- Images: The stained glass art images are stored in the \"images\" directory, with filenames ranging from \"0.jpg\" to \"URL.\"\n- Captions: Captions for each image are provided in the \"URL\" file located in the dataset's root directory. The captions contain placeholders for adjectives and a custom token to represent stained glass art. For example: \"A {adjective} {token} of a puppy.\"\n- Adjective Placeholders: During training, the {adjective} placeholder in the captions can be randomly selected from the following list: '[\"\", \"good\", \"cropped\", \"clean\", \"bright\", \"cool\", \"nice\", \"small\", \"large\", \"dark\", \"weird\"]'.\n- Token Placeholder: The {token} placeholder represents the custom token that needs to be trained to capture the unique art style of stained glass. This token is a key element in generating realistic stained glass art-inspired images."
] | [
"TAGS\n#task_categories-text-to-image #task_categories-image-to-image #task_categories-unconditional-image-generation #size_categories-n<1K #license-mit #art #region-us \n",
"# Stained Glass Art Dataset for Diffusion Models",
"## Overview\nThis dataset consists of 21 high-resolution images of stained glass art, accompanied by corresponding captions. It is designed for fine-tuning diffusion models using techniques such as textual inversion and dreambooth. The dataset is intended to facilitate research and experimentation in generating stained glass art-inspired images.",
"## Dataset Structure\n- Images: The stained glass art images are stored in the \"images\" directory, with filenames ranging from \"0.jpg\" to \"URL.\"\n- Captions: Captions for each image are provided in the \"URL\" file located in the dataset's root directory. The captions contain placeholders for adjectives and a custom token to represent stained glass art. For example: \"A {adjective} {token} of a puppy.\"\n- Adjective Placeholders: During training, the {adjective} placeholder in the captions can be randomly selected from the following list: '[\"\", \"good\", \"cropped\", \"clean\", \"bright\", \"cool\", \"nice\", \"small\", \"large\", \"dark\", \"weird\"]'.\n- Token Placeholder: The {token} placeholder represents the custom token that needs to be trained to capture the unique art style of stained glass. This token is a key element in generating realistic stained glass art-inspired images."
] | [
62,
13,
80,
247
] | [
"passage: TAGS\n#task_categories-text-to-image #task_categories-image-to-image #task_categories-unconditional-image-generation #size_categories-n<1K #license-mit #art #region-us \n# Stained Glass Art Dataset for Diffusion Models## Overview\nThis dataset consists of 21 high-resolution images of stained glass art, accompanied by corresponding captions. It is designed for fine-tuning diffusion models using techniques such as textual inversion and dreambooth. The dataset is intended to facilitate research and experimentation in generating stained glass art-inspired images.## Dataset Structure\n- Images: The stained glass art images are stored in the \"images\" directory, with filenames ranging from \"0.jpg\" to \"URL.\"\n- Captions: Captions for each image are provided in the \"URL\" file located in the dataset's root directory. The captions contain placeholders for adjectives and a custom token to represent stained glass art. For example: \"A {adjective} {token} of a puppy.\"\n- Adjective Placeholders: During training, the {adjective} placeholder in the captions can be randomly selected from the following list: '[\"\", \"good\", \"cropped\", \"clean\", \"bright\", \"cool\", \"nice\", \"small\", \"large\", \"dark\", \"weird\"]'.\n- Token Placeholder: The {token} placeholder represents the custom token that needs to be trained to capture the unique art style of stained glass. This token is a key element in generating realistic stained glass art-inspired images."
] |
7aa07bfa2773d6fd9e210dc82ca4ae89ff6eeccb |
# Bangumi Image Base of Cross Ange - Tenshi To Ryuu No Rondo
This is the image base of bangumi Cross Ange - Tenshi to Ryuu no Rondo, we detected 67 characters, 4478 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy.** If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 40 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 111 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 30 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 22 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 229 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 64 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 117 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 67 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 179 | [Download](8/dataset.zip) |  |  |  |  |  |  |  |  |
| 9 | 40 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 38 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 40 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 24 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 32 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 79 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 69 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 21 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 28 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 201 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 90 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 601 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 121 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 95 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 56 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 96 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 25 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 32 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 18 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 13 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 37 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 13 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 12 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 24 | [Download](32/dataset.zip) |  |  |  |  |  |  |  |  |
| 33 | 156 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 16 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 74 | [Download](35/dataset.zip) |  |  |  |  |  |  |  |  |
| 36 | 19 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 14 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| 38 | 80 | [Download](38/dataset.zip) |  |  |  |  |  |  |  |  |
| 39 | 33 | [Download](39/dataset.zip) |  |  |  |  |  |  |  |  |
| 40 | 23 | [Download](40/dataset.zip) |  |  |  |  |  |  |  |  |
| 41 | 8 | [Download](41/dataset.zip) |  |  |  |  |  |  |  |  |
| 42 | 24 | [Download](42/dataset.zip) |  |  |  |  |  |  |  |  |
| 43 | 21 | [Download](43/dataset.zip) |  |  |  |  |  |  |  |  |
| 44 | 23 | [Download](44/dataset.zip) |  |  |  |  |  |  |  |  |
| 45 | 195 | [Download](45/dataset.zip) |  |  |  |  |  |  |  |  |
| 46 | 9 | [Download](46/dataset.zip) |  |  |  |  |  |  |  |  |
| 47 | 23 | [Download](47/dataset.zip) |  |  |  |  |  |  |  |  |
| 48 | 15 | [Download](48/dataset.zip) |  |  |  |  |  |  |  |  |
| 49 | 42 | [Download](49/dataset.zip) |  |  |  |  |  |  |  |  |
| 50 | 62 | [Download](50/dataset.zip) |  |  |  |  |  |  |  |  |
| 51 | 24 | [Download](51/dataset.zip) |  |  |  |  |  |  |  |  |
| 52 | 8 | [Download](52/dataset.zip) |  |  |  |  |  |  |  |  |
| 53 | 16 | [Download](53/dataset.zip) |  |  |  |  |  |  |  |  |
| 54 | 279 | [Download](54/dataset.zip) |  |  |  |  |  |  |  |  |
| 55 | 42 | [Download](55/dataset.zip) |  |  |  |  |  |  |  |  |
| 56 | 7 | [Download](56/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 57 | 115 | [Download](57/dataset.zip) |  |  |  |  |  |  |  |  |
| 58 | 19 | [Download](58/dataset.zip) |  |  |  |  |  |  |  |  |
| 59 | 20 | [Download](59/dataset.zip) |  |  |  |  |  |  |  |  |
| 60 | 8 | [Download](60/dataset.zip) |  |  |  |  |  |  |  |  |
| 61 | 18 | [Download](61/dataset.zip) |  |  |  |  |  |  |  |  |
| 62 | 6 | [Download](62/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 63 | 9 | [Download](63/dataset.zip) |  |  |  |  |  |  |  |  |
| 64 | 5 | [Download](64/dataset.zip) |  |  |  |  |  | N/A | N/A | N/A |
| 65 | 6 | [Download](65/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| noise | 395 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
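As a rough sketch of how one of the character packs above could be fetched programmatically (the chosen character id and output directory are arbitrary; the filename follows the relative download links in the table, and `huggingface_hub` must be installed):
```python
from huggingface_hub import hf_hub_download
import zipfile

# Download one character pack (character #20 here) from this dataset repo.
archive = hf_hub_download(
    repo_id="BangumiBase/crossangetenshitoryuunorondo",
    filename="20/dataset.zip",
    repo_type="dataset",
)

# Unpack it locally; remember to screen the images for the ~1% noisy samples.
with zipfile.ZipFile(archive) as zf:
    zf.extractall("character_20")
```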
| BangumiBase/crossangetenshitoryuunorondo | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | 2023-12-13T08:05:26+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2023-12-13T13:21:55+00:00 | [] | [] | TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of Cross Ange - Tenshi To Ryuu No Rondo
==========================================================
This is the image base of bangumi Cross Ange - Tenshi to Ryuu no Rondo, we detected 67 characters, 4478 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may actually be noisy. If you intend to manually train models using this dataset, we recommend performing the necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| [] | [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] | [
25
] | [
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |
87463586aa119e09ba858d61e846e54e5e33b159 | ## 训练数据/Training Data
百万级语料中文54%,英文46%;其中其中数据集包括**12**个领域包括金融,社会,生物,商业,工业制造,化学,车辆,科学,疾病医疗,个人生活,安全和通用。覆盖数百个使用场景
- NER:中文覆盖**28**个实体类型包括人物,地缘政治,组织,身体部位,药物等,英文覆盖**130**个实体类型包括Animal, Weapon, Conference, Book等。
- RE:中文覆盖**232**种关系包括买资,增持,重组,国籍,别名,亲属,入股,转让,导致,发生地点,制造商等,英文覆盖**236**种关系包括founded by,state or province of headquarters,employee of,occupation,creator等。
- EE:中文覆盖**84**种事件类型,包括中标,高管变动,产品行为-发布,公司上市等,和**203**种论元,英文覆盖**45**种事件类型,包括Born, Demonstrate, Meet, End Organization, Divorce等,和**62**种论元。
In the corpus of over a million entries, 54% are in Chinese and 46% in English. The dataset encompasses 12 fields including finance, society, biology, business, industrial manufacturing, chemistry, vehicles, science, disease and medicine, personal life, security, and general topics, covering hundreds of scenarios:
- NER: In Chinese, it covers **28** types of entities including individuals, geopolitics, organizations, body parts, drugs, etc., while in English, it covers 130 types of entities such as Animals, Weapons, Conferences, Books, etc.
- RE: In Chinese, it includes **232** types of relations like acquisitions, stake increases, restructurings, nationality, aliases, relatives, buying shares, transfers, causes, locations of occurrence, manufacturers, etc., and in English, 236 types of relations such as founded by, state or province of headquarters, employee of, occupation, creator, etc.
- EE: Chinese covers **84** types of events including winning a bid, executive changes, product actions - launches, company listings, etc., and **203** types of arguments, whereas English covers **45** types of events such as Birth, Demonstration, Meeting, End of Organization, Divorce, etc., and **62** types of arguments.
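For reference, the corpus can be pulled straight from the Hugging Face Hub with the `datasets` library (a minimal sketch; the split name and record fields are assumptions, only the repository id comes from this card):
```python
from datasets import load_dataset

# Load the instruction-tuning corpus from the Hub.
ds = load_dataset("wenge-research/yayi_uie_sft_data")

print(ds)                # inspect the available splits and features
sample = ds["train"][0]  # the "train" split name is an assumption
print(sample)
```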
 | wenge-research/yayi_uie_sft_data | [
"size_categories:1M<n<10M",
"language:zh",
"language:en",
"license:apache-2.0",
"region:us"
] | 2023-12-13T08:27:48+00:00 | {"language": ["zh", "en"], "license": "apache-2.0", "size_categories": ["1M<n<10M"]} | 2023-12-14T13:52:33+00:00 | [] | [
"zh",
"en"
] | TAGS
#size_categories-1M<n<10M #language-Chinese #language-English #license-apache-2.0 #region-us
| ## 训练数据/Training Data
百万级语料中文54%,英文46%;其中其中数据集包括12个领域包括金融,社会,生物,商业,工业制造,化学,车辆,科学,疾病医疗,个人生活,安全和通用。覆盖数百个使用场景
- NER:中文覆盖28个实体类型包括人物,地缘政治,组织,身体部位,药物等,英文覆盖130个实体类型包括Animal, Weapon, Conference, Book等。
- RE:中文覆盖232种关系包括买资,增持,重组,国籍,别名,亲属,入股,转让,导致,发生地点,制造商等,英文覆盖236种关系包括founded by,state or province of headquarters,employee of,occupation,creator等。
- EE:中文覆盖84种事件类型,包括中标,高管变动,产品行为-发布,公司上市等,和203种论元,英文覆盖45种事件类型,包括Born, Demonstrate, Meet, End Organization, Divorce等,和62种论元。
In the corpus of over a million entries, 54% are in Chinese and 46% in English. The dataset encompasses 12 fields including finance, society, biology, business, industrial manufacturing, chemistry, vehicles, science, disease and medicine, personal life, security, and general topics, covering hundreds of scenarios:
- NER: In Chinese, it covers 28 types of entities including individuals, geopolitics, organizations, body parts, drugs, etc., while in English, it covers 130 types of entities such as Animals, Weapons, Conferences, Books, etc.
- RE: In Chinese, it includes 232 types of relations like acquisitions, stake increases, restructurings, nationality, aliases, relatives, buying shares, transfers, causes, locations of occurrence, manufacturers, etc., and in English, 236 types of relations such as founded by, state or province of headquarters, employee of, occupation, creator, etc.
- EE: Chinese covers 84 types of events including winning a bid, executive changes, product actions - launches, company listings, etc., and 203 types of arguments, whereas English covers 45 types of events such as Birth, Demonstration, Meeting, End of Organization, Divorce, etc., and 62 types of arguments.
!数据分布 | [
"## 训练数据/Training Data\n百万级语料中文54%,英文46%;其中其中数据集包括12个领域包括金融,社会,生物,商业,工业制造,化学,车辆,科学,疾病医疗,个人生活,安全和通用。覆盖数百个使用场景\n- NER:中文覆盖28个实体类型包括人物,地缘政治,组织,身体部位,药物等,英文覆盖130个实体类型包括Animal, Weapon, Conference, Book等。\n- RE:中文覆盖232种关系包括买资,增持,重组,国籍,别名,亲属,入股,转让,导致,发生地点,制造商等,英文覆盖236种关系包括founded by,state or province of headquarters,employee of,occupation,creator等。\n- EE:中文覆盖84种事件类型,包括中标,高管变动,产品行为-发布,公司上市等,和203种论元,英文覆盖45种事件类型,包括Born, Demonstrate, Meet, End Organization, Divorce等,和62种论元。\n\nIn the corpus of over a million entries, 54% are in Chinese and 46% in English. The dataset encompasses 12 fields including finance, society, biology, business, industrial manufacturing, chemistry, vehicles, science, disease and medicine, personal life, security, and general topics, covering hundreds of scenarios:\n\n- NER: In Chinese, it covers 28 types of entities including individuals, geopolitics, organizations, body parts, drugs, etc., while in English, it covers 130 types of entities such as Animals, Weapons, Conferences, Books, etc.\n- RE: In Chinese, it includes 232 types of relations like acquisitions, stake increases, restructurings, nationality, aliases, relatives, buying shares, transfers, causes, locations of occurrence, manufacturers, etc., and in English, 236 types of relations such as founded by, state or province of headquarters, employee of, occupation, creator, etc.\n- EE: Chinese covers 84 types of events including winning a bid, executive changes, product actions - launches, company listings, etc., and 203 types of arguments, whereas English covers 45 types of events such as Birth, Demonstration, Meeting, End of Organization, Divorce, etc., and 62 types of arguments.\n\n!数据分布"
] | [
"TAGS\n#size_categories-1M<n<10M #language-Chinese #language-English #license-apache-2.0 #region-us \n",
"## 训练数据/Training Data\n百万级语料中文54%,英文46%;其中其中数据集包括12个领域包括金融,社会,生物,商业,工业制造,化学,车辆,科学,疾病医疗,个人生活,安全和通用。覆盖数百个使用场景\n- NER:中文覆盖28个实体类型包括人物,地缘政治,组织,身体部位,药物等,英文覆盖130个实体类型包括Animal, Weapon, Conference, Book等。\n- RE:中文覆盖232种关系包括买资,增持,重组,国籍,别名,亲属,入股,转让,导致,发生地点,制造商等,英文覆盖236种关系包括founded by,state or province of headquarters,employee of,occupation,creator等。\n- EE:中文覆盖84种事件类型,包括中标,高管变动,产品行为-发布,公司上市等,和203种论元,英文覆盖45种事件类型,包括Born, Demonstrate, Meet, End Organization, Divorce等,和62种论元。\n\nIn the corpus of over a million entries, 54% are in Chinese and 46% in English. The dataset encompasses 12 fields including finance, society, biology, business, industrial manufacturing, chemistry, vehicles, science, disease and medicine, personal life, security, and general topics, covering hundreds of scenarios:\n\n- NER: In Chinese, it covers 28 types of entities including individuals, geopolitics, organizations, body parts, drugs, etc., while in English, it covers 130 types of entities such as Animals, Weapons, Conferences, Books, etc.\n- RE: In Chinese, it includes 232 types of relations like acquisitions, stake increases, restructurings, nationality, aliases, relatives, buying shares, transfers, causes, locations of occurrence, manufacturers, etc., and in English, 236 types of relations such as founded by, state or province of headquarters, employee of, occupation, creator, etc.\n- EE: Chinese covers 84 types of events including winning a bid, executive changes, product actions - launches, company listings, etc., and 203 types of arguments, whereas English covers 45 types of events such as Birth, Demonstration, Meeting, End of Organization, Divorce, etc., and 62 types of arguments.\n\n!数据分布"
] | [
35,
541
] | [
"passage: TAGS\n#size_categories-1M<n<10M #language-Chinese #language-English #license-apache-2.0 #region-us \n"
] |
947d5f5fcaa2d2cddc8fa8ca26820975f9df3d74 | # Dataset Card for "SolFuncs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | Pipper/SolFuncs | [
"region:us"
] | 2023-12-13T09:09:43+00:00 | {"dataset_info": {"features": [{"name": "file_name", "dtype": "string"}, {"name": "comments", "dtype": "string"}, {"name": "code_string", "dtype": "string"}, {"name": "__index_level_0__", "dtype": "int64"}], "splits": [{"name": "train", "num_bytes": 15153841436.646095, "num_examples": 591941}, {"name": "test", "num_bytes": 1894239779.676952, "num_examples": 73993}, {"name": "valid", "num_bytes": 1894239779.676952, "num_examples": 73993}], "download_size": 5432099769, "dataset_size": 18942320996.0}} | 2023-12-14T06:30:10+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "SolFuncs"
More Information needed | [
"# Dataset Card for \"SolFuncs\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"SolFuncs\"\n\nMore Information needed"
] | [
6,
13
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"SolFuncs\"\n\nMore Information needed"
] |
d3ed97b106d926c4eb1b7b8b815e694a04ea679b | [**SeroLean**](https://myhealthfitnessmart.blogspot.com/2023/12/serolean-reviews-usa-canada-is-it.html) is a high-value, multi-part weight loss supplement that has helped over 24,000 people across the country. Its usefulness and natural mix are unmatched in the industry, and it is your best bet for losing weight in the healthiest way possible. Crafted by curating a variety of natural ingredients, it starts delivering results within a month while avoiding any kind of negative adverse effects. Boasting a high value-for-money equation, [**SeroLean**](https://serolean-usa-canada.clubeo.com/calendar/2023/12/14/serolean-reviews-important-information-exposed-know-this-before-buy) is your ultimate weight loss partner if you are facing a budget crunch.
<h2><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="https://www.globalfitnessmart.com/get-serolean"><strong>Click Here -- Official Website -- Order Now</strong></a></span></h2>
<h2><strong>➡️<span style="color: blue;">● For Order Official Website - <a style="color: blue;" href="https://www.globalfitnessmart.com/get-serolean">https://www.globalfitnessmart.com/get-serolean</a></span></strong><br /><strong>➡️<span style="color: #ff6600;">● Item Name: — <a style="color: #ff6600;" href="https://www.globalfitnessmart.com/get-serolean">SeroLean</a></span></strong><br /><strong>➡️<span style="color: green;">● Ingredients: — All Natural</span></strong><br /><span style="color: green;"><strong>➡️<span style="color: magenta;">● Incidental Effects: — NA</span></strong></span><br /><strong>➡️<span style="color: maroon;">● Accessibility: — <a style="color: maroon;" href="https://www.globalfitnessmart.com/get-serolean">Online</a></span></strong></h2>
<h2><span style="background-color: yellow; color: red;"><strong>✅HUGE DISCOUNT ! HURRY UP! ORDER NOW!✅</strong></span><br /><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="https://www.globalfitnessmart.com/get-serolean"><strong>✅HUGE DISCOUNT ! HURRY UP! ORDER NOW!✅</strong></a></span><br /><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="https://www.globalfitnessmart.com/get-serolean"><strong>✅HUGE DISCOUNT ! HURRY UP! ORDER NOW!✅</strong></a></span></h2>
## What is [SeroLean](https://serolean-usa-canada.clubeo.com/page/serolean-reviews-usa-canada-is-it-the-ultimate-weight-loss-supplement.html)? Explain Briefly!
Generally, [**SeroLean**](https://serolean-usa-canada.clubeo.com/page/serolean-reviews-is-it-scam-or-trusted-read-ingredients.html) is a natural weight loss product formulated to support natural and healthy weight loss. Unlike other products on the market, [**SeroLean**](https://serolean-usa-canada.clubeo.com/) concentrates on offering a holistic approach to weight reduction by dealing with multiple factors that lead to weight gain.
It mainly helps to improve metabolism, enhance energy levels, control appetite, and encourage a positive mood. All of these benefits can be obtained by using this product.
This formula is designed under safe standards that allow safe outcomes for users. Its distinctive formulation is designed to encourage the production of serotonin in the body. In addition, these compact capsules can offer effective and efficient weight loss outcomes by addressing the factors behind weight gain throughout the entire body.
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="https://www.globalfitnessmart.com/get-serolean"><strong>(EXCLUSIVE OFFER)Click Here : "SeroLean USA & CANADA"Official Website!</strong></a></span></h2>
## How Does [SeroLean](https://knownet.com/question/serolean-reviews-is-it-scam-or-trusted-read-ingredients/) Work To Induce Healthy Weight Loss?
[**SeroLean**](https://groups.google.com/g/serolean-usa/c/laLY0wVqaCM) works through its potent weight loss formula, which focuses on inducing weight loss by bringing balance to your serotonin levels. The effectiveness of [**SeroLean**](https://sites.google.com/view/serolean-review-usa/home) lies in its unique blend of natural components that work together to encourage weight loss. This organic product supplies adequate nutrients to raise levels of serotonin, known as the "feel-good" neurotransmitter.
This neurotransmitter plays an important role in regulating appetite, sleep, and mood. The manufacturer of [**SeroLean**](https://colab.research.google.com/drive/1CE9Deh8bgeBwGLim_qGIcPJdvIC1y8x6) prepared a unique solution that can regulate this neurotransmitter and harness the power of serotonin to enhance emotional well-being. In this way, it potentially helps with weight management.
Using these capsules continuously may help users obtain healthy weight loss outcomes along with other health advantages. [**SeroLean**](https://serolean-review.company.site/) directly deals with the key areas that lead to weight gain and difficulty losing weight. Here are some important roles of this product that promote weight loss.
**Enhancing metabolism** – A primary task of [**SeroLean**](https://lookerstudio.google.com/u/0/reporting/d95ec4ec-d7b6-495e-b15c-4d4f15a22f7e/page/TX7kD) is to help your body burn excess calories and fat, making it easier to lose weight even while at rest.
**Controlling appetite** – Appetite is the biggest challenge, and the formula acts to suppress it so that users can stick to a calorie-restricted diet.
**Fat-burning extracts** – SeroLean contains only natural, fat-burning extracts that increase thermogenesis, the process by which your body converts calories into heat. By improving thermogenesis, these capsules support weight loss and help users reach their weight loss goals.
## What Are The Multiple Weight Loss Benefits Of Consuming [SeroLean](https://serolean-review.company.site/)?
**Reduce hunger** – Cravings and overeating can cause extra weight. Serotonin controls hunger by telling the brain that your stomach is full. SeroLean helps your mind know that you no longer need extra food, leading to fat burning.
**Increase serotonin production** – SeroLean mainly targets your serotonin levels in the brain. It stimulates serotonin production, which controls mood, appetite, sleep, energy, cognition, and digestion. High serotonin levels help you attain a healthy weight.
**Enhance metabolic rate** – The SeroLean supplement stimulates metabolism, enabling you to break down excess fat even at rest. It provides a healthy balance of serotonin, which ensures a speedy metabolism.
**Improve sleep quality** – The fat burner contains serotonin-boosting components that raise your serotonin levels, directly impacting your sleep. SeroLean PM provides calming effects and lessens stress and anxiety. People who use SeroLean have reported less sleep disturbance.
**Regulate mood** – Serotonin is a feel-good hormone that promotes mood and happiness. SeroLean elevates your mood and brightens your day. It increases motivation, productivity, optimism, and positivity.
**Boost energy levels** – While you burn extra fat and carbs, your body releases energy. SeroLean deals with fatigue and low energy levels. It lets you wake up feeling refreshed and rejuvenated to conquer the day.
**Lower stress and anxiety** – Anxiety and worry can cause emotional eating and weight gain. SeroLean works by reducing cortisol levels and, in exchange, triggers serotonin production, which helps calm the mind and eliminate stress, worry, and depression.
**Reduce carb and fat absorption** – Carbs and fats are the main culprits in weight gain. The active ingredients in SeroLean support breaking down fats and carbs for energy. The fat burner also prevents carb and fat absorption and unpacks the stubborn fat in the body.
**Regulate blood sugar and blood pressure levels** – Unhealthy blood pressure and blood sugar go hand in hand with unhealthy weight. SeroLean helps with weight management, which means healthy blood sugar and blood pressure levels.
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="https://www.globalfitnessmart.com/get-serolean"><strong>SPECIAL PROMO[Limited Discount]: "SeroLean USA & CANADA"Official Website!</strong></a></span></h2>
## What Are The Natural Ingredients In SeroLean?
SeroLean is a weight-management product developed using premium components. Each capsule includes a synergistic blend of the purest plant ingredients available from nature. In this part, we explore the advantages of the primary extracts used to make SeroLean:
**Green Tea Extract**
This ingredient has long been utilized to assist with weight reduction and mental sharpness. Additionally, it facilitates post-workout recovery by keeping blood sugar levels stable. Used appropriately, this formula boosts thermogenesis and metabolism, allowing users to burn excess calories throughout the day.
**Saffron Extract**
This rare Indian plant increases serotonin in the brain and decreases food cravings. Saffron is rich in antioxidants and is popular for its potential to improve both focus and libido.
**Nettle Root**
Stinging nettle leaves contain antioxidants that may be used to deal with inflammation. Advantages include a return to optimal energy levels, immunity, blood pressure, and the body's response to stress.
**5-HTP**
This ingredient helps to raise serotonin levels and induces tranquility in the mind. In addition, the blend relies on the appetite-suppressing qualities of serotonin, which makes this extract crucial for proper weight reduction.
**Vitamin B6**
It is essential for the conversion of 5-HTP to serotonin, which helps to enhance and stabilize mood. In general, breakfast will satisfy the appetite and prevent snacking during the day.
**L-Theanine**
This amino acid has been shown to enhance mental capacity. It has been linked to less stress and anxiety, deeper sleep, and a more positive outlook. Additionally, stress in both mind and body can melt away thanks to this powerful amino acid.
**Ashwagandha**
Its adaptogenic properties include a decline in cortisol. The stress hormone cortisol interferes with the production of serotonin, so lowering it may decrease overeating.
**L-Carnitine**
This ingredient is essential in the process of converting digested food into usable energy. Skeletal muscle, the kidneys, and the liver all generate it naturally.
**White Bean Powder**
It restricts the number of calories your body takes in from food and regulates the production of glucose-converting enzymes throughout the body. You will feel less hunger and may have greater control over diabetes or blood sugar.
## Pros of SeroLean:
- SeroLean harnesses the power of all-natural ingredients to stimulate serotonin levels in your body, making it a potent tool for shedding excess fat.
- Experience tranquil evenings and restful sleep with this calming two-part blend while supporting your body's metabolism and fat burning throughout the night.
- Each morning, you'll awaken feeling revitalized, re-energized, and ready to kickstart your body's metabolic processes again.
- Proper use of SeroLean, following the easy AM-PM daily schedule, can be a life-changing experience, helping you attain the attractive figure you've always desired. With SeroLean, expect results that endure over time, allowing you to lead a better, healthier, and slimmer life.
- The nutritional complex within SeroLean contributes to elevated serotonin levels, effectively curbing hunger and emotional overeating throughout the day.
- Beyond weight management, SeroLean supports a healthy inflammatory response, maintains optimal cholesterol levels, reduces occasional stress and anxiety, relieves occasional aches, and provides joint comfort.
- The natural components in this SeroLean formula extend their benefits to promote improved sleep and enhance the health of your skin and hair.
- It can also boost brain function and elevate your overall health, well-being, and quality of life.
<h2><strong>Cons of SeroLean:</strong></h2>
<p>SeroLean is exclusively accessible online, making it unavailable for purchase through offline methods.</p>
<p>Before SeroLean consumption, it is strongly recommended to carefully review the product label for instructions and important information.</p>
<p>When incorporating SeroLean supplement into your routine, it is advisable to follow the dosage recommendations provided by your healthcare provider for optimal results.</p>
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="https://www.globalfitnessmart.com/get-serolean"><strong>SPECIAL PROMO: Get SeroLean at the Lowest Discounted Price Online</strong></a></span></h2>
<h2><strong>The Science Behind SeroLean</strong></h2>
<p>Dr. Posner's extensive research on weight loss and gain led him to identify serotonin imbalance as a significant obstacle to achieving sustainable weight loss. Serotonin, often referred to as the "feel-good" hormone, plays a crucial role in regulating appetite, sleep, mood, and behavior. Low levels of serotonin can lead to increased cravings for sugary and carb-filled snacks, making it challenging to maintain a healthy diet.</p>
<p>To address this issue, SeroLean incorporates an ingredient called 5-HTP, which increases serotonin synthesis in the body. This, combined with other fat-burning ingredients, helps individuals lose weight more easily with frequent usage. By boosting serotonin levels and controlling cravings, SeroLean aids in restoring balance to the brain and promoting healthy weight loss.</p>
<h2><strong>How Should You Consume SeroLean for Optimal Outcomes?</strong></h2>
<p>The manufacturer recommends taking 2 capsules from the morning bottle and 2 capsules from the evening bottle (that is, dosing twice daily). Make sure to use this product consistently for at least 3 months to get the full advantages of SeroLean.</p>
<p>In addition to using this product, a low-carb diet and regular physical exercise are suggested. Put together, these measures will kick-start your weight loss journey toward reaching and maintaining a healthy weight.</p>
<h2><strong>Pricing and Availability</strong></h2>
<p>SeroLean can only be purchased from the official website to ensure the authenticity of the product. The supplement is available in three packages:</p>
<ul>
<li><strong>One-Month Supply: $59 per bottle.</strong></li>
<li><strong>Three-Month Supply: $49 per bottle, with an extra SeroLean PM bottle included.</strong></li>
<li><strong>Six-Month Supply: $39 per bottle, with two extra SeroLean PM bottles included.</strong></li>
</ul>
<h2><strong>Conclusion</strong></h2>
<p>SeroLean is a breakthrough weight loss supplement designed to address the underlying causes of uncontrollable hunger and cravings. By optimizing neurotransmitters and hormones related to appetite regulation, SeroLean helps individuals regain control over their eating habits, leading to healthy weight loss. With its proprietary blend of natural ingredients, SeroLean offers a safe and effective solution for those seeking to achieve their weight loss goals. If you're looking for a reliable weight loss supplement backed by scientific research and positive customer reviews, SeroLean may be the right choice for you.</p>
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="https://www.globalfitnessmart.com/get-serolean"><strong>Exclusive Details: *SeroLean* Read More Details on Official Website USA & CANADA!</strong></a></span></h2>
<h2><strong># READ MORE INFO-</strong></h2>
<p><strong><a href="https://knownet.com/question/serolean-reviews-usa-canada-is-it-the-ultimate-weight-loss-supplement/">https://knownet.com/question/serolean-reviews-usa-canada-is-it-the-ultimate-weight-loss-supplement/</a></strong></p>
<p><strong><a href="https://serolean-usa-canada.clubeo.com/calendar/2023/12/14/serolean-reviews-important-information-exposed-know-this-before-buy">https://serolean-usa-canada.clubeo.com/calendar/2023/12/14/serolean-reviews-important-information-exposed-know-this-before-buy</a></strong></p>
<p><strong><a href="https://serolean-usa-canada.clubeo.com/page/serolean-reviews-usa-canada-is-it-the-ultimate-weight-loss-supplement.html">https://serolean-usa-canada.clubeo.com/page/serolean-reviews-usa-canada-is-it-the-ultimate-weight-loss-supplement.html</a></strong></p>
<p><strong><a href="https://serolean-usa-canada.clubeo.com/page/serolean-reviews-is-it-scam-or-trusted-read-ingredients.html">https://serolean-usa-canada.clubeo.com/page/serolean-reviews-is-it-scam-or-trusted-read-ingredients.html</a></strong></p>
<p><strong><a href="https://serolean-usa-canada.clubeo.com/">https://serolean-usa-canada.clubeo.com/</a></strong></p>
<p><strong><a href="https://groups.google.com/g/serolean-usa/c/laLY0wVqaCM">https://groups.google.com/g/serolean-usa/c/laLY0wVqaCM</a></strong></p>
<p><strong><a href="https://myhealthfitnessmart.blogspot.com/2023/12/serolean-reviews-usa-canada-is-it.html">https://myhealthfitnessmart.blogspot.com/2023/12/serolean-reviews-usa-canada-is-it.html</a></strong></p>
<p><strong><a href="https://sites.google.com/view/serolean-review-usa/home">https://sites.google.com/view/serolean-review-usa/home</a><br /></strong></p>
<p><strong><a href="https://colab.research.google.com/drive/1CE9Deh8bgeBwGLim_qGIcPJdvIC1y8x6">https://colab.research.google.com/drive/1CE9Deh8bgeBwGLim_qGIcPJdvIC1y8x6</a></strong></p>
<p><strong><a href="https://lookerstudio.google.com/u/0/reporting/d95ec4ec-d7b6-495e-b15c-4d4f15a22f7e/page/TX7kD">https://lookerstudio.google.com/u/0/reporting/d95ec4ec-d7b6-495e-b15c-4d4f15a22f7e/page/TX7kD</a></strong></p>
<p><strong><a href="https://serolean-review.company.site/">https://serolean-review.company.site/</a></strong></p>
<p><strong><a href="https://huggingface.co/seroleanofficial/serolean/blob/main/README.md">https://huggingface.co/seroleanofficial/serolean/blob/main/README.md</a></strong></p> | seroleanofficial/serolean-review | [
"region:us"
] | 2023-12-13T09:13:53+00:00 | {} | 2023-12-13T09:14:16+00:00 | [] | [] | TAGS
#region-us
| <p><a href="URL is a high-value multiple-parts weight loss supplement that has helped over 24,000 people across the country. Its usefulness and natural mix are unmatched in the industry and are your best bet at losing weight in the healthiest way possible.Crafted by curating a variety of natural ingredients, you will start experiencing the results within a month, and that too while avoiding any kind of negative adverse effects. Boasting a high value-for-money equation, <a href="URL is your ultimate weight loss partner if you are facing a budget crunch.</p>
<h2><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="URL Here -- Official Website -- Order Now</strong></a></span></h2>
<h2><strong>️<span style="color: blue;">● For Order Official Website - <a style="color: blue;" href="URL/URL /><strong>️<span style="color: #ff6600;">● Item Name: — <a style="color: #ff6600;" href="URL /><strong>️<span style="color: green;">● Ingredients: — All Natural</span></strong><br /><span style="color: green;"><strong>️<span style="color: magenta;">● Incidental Effects: — NA</span></strong></span><br /><strong>️<span style="color: maroon;">● Accessibility: — <a style="color: maroon;" href="URL
<h2><span style="background-color: yellow; color: red;"><strong>HUGE DISCOUNT ! HURRY UP! ORDER NOW!</strong></span><br /><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="URL DISCOUNT ! HURRY UP! ORDER NOW!</strong></a></span><br /><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="URL DISCOUNT ! HURRY UP! ORDER NOW!</strong></a></span></h2>
<h2><strong>What is <a href="URL Explain Briefly!</strong></h2>
<p>Generally, <a href="URL is a natural weight loss product that has been formulated to support natural and healthy weight loss. Unlike other products in the market, <a href="URL </a>concentrates on offering a holistic approach to weight reduction by dealing with multiple factors that lead to weight gain.</p>
<p>It mainly helps to improve metabolism and enhance energy levels. Control appetite, and encourage a positive mood. All of these benefits can be obtained by using this product.</p>
<p>This incredible formula is designed under safe standards that allow safe outcomes for the users. This distinctive formulation of the product is formulated to encourage the production of serotonin in the body. In addition, these compact capsules can offer effective and efficient outcomes in weight loss by maintaining weight gain factors in the entire body.</p>
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="URL OFFER)Click Here : "SeroLean USA & CANADA"Official Website!</strong></a></span></h2>
<h2><strong>How Does <a href="URL Work To Induce Healthy Weight Loss?</strong></h2>
<p><a href="URL works through its potent weight loss formula that focuses on inducing weight loss by bringing balance to your serotonin levels. The effectiveness of <a href="URL lies in the unique and natural blend of natural components that work ideally to encourage weight loss. This organic product works with adequate nutrients to kick out serotonin levels which is known as a “feel-good” neurotransmitter.</p>
<p>This neurotransmitter plays an important role in stimulating appetite, sleep, and mood. The manufacturer of <a href="URL prepares an amazing and unique solution that can regulate this neurotransmitter and harness the energy of serotonin to enhance emotional well-being. In this way, it potentially helps in weight management.</p>
<p>Utilizing these capsules continuously may help users to obtain healthy weight loss outcomes and they may also meet amazing health advantages. <a href="URL directly deals with the key areas that lead to weight gain and trouble in reducing weight. Here are some important roles of this product that promote weight loss.</p>
<p><strong>Enhancing metabolism</strong> – It is a primary task of <a href="URL that allows your body to burn excessive calories and fat from the body. It makes it easier to reduce weight while at rest.</p>
<p><strong>Control Appetite –</strong> It is the biggest challenge and it acts efficiently to suppress appetite and make everyone stick to a calorie-restricted diet.</p>
<p><strong>Contains Fat-Burning Extracts –</strong> SeroLean contains only natural and fat-burning extracts that increase the thermogenesis process. It is an important process by which your body regulates calories and heat. By improving thermogenesis, these capsules help in weight loss and help in achieving weight loss goals.</p>
<h2><strong>What Are The Multiple Weight Loss Benefits Of Consuming <a href="URL
<p><strong>Reduce hunger- c</strong>ravings and overeating can cause extra weight. Serotonin controls hunger by telling the brain that your stomach is full. SeroLean helps your mind know that you no longer need extra food, leading to fat burning.</p>
<p><strong>Increase serotonin production</strong>- SeroLean mainly targets your serotonin levels in the brain. It stimulates serotonin production, which controls mood, appetite, sleep, energy, cognition, and digestion. High serotonin levels help you attain a healthy weight.</p>
<p><strong>Enhance metabolic rate</strong>- SeroLean supplement stimulates metabolism, enabling you to break down excess fat even at rest. It provides a healthy balance of serotonin, which ensures a speedy metabolism.</p>
<p><strong>Improve sleep quality-</strong> the fat burner contains serotonin-boosting components that raise your serotonin levels, directly impacting your sleep levels. SeroLean PM provides calming effects and lessens stress and anxiety. People who use SeroLean have reported less sleep disturbance.</p>
<p><strong>Regulate mood- </strong>serotonin is a feel-good hormone that promotes mood and happiness. SeroLean elevates your mood and brightens your day. It increases motivation, productivity, optimism, and positivity.</p>
<p><strong>Boost energy levels-</strong> while you burn extra fat and carbs, your body releases energy. SeroLean deals with fatigue and low energy levels. It lets you wake up feeling refreshed and rejuvenated to conquer the day.</p>
<p><strong>Lower stress and anxiety-</strong> anxiety and worry can cause emotional eating and weight gain. SeroLean works by reducing cortisol levels and, in exchange, triggers serotonin production, which helps calm the mind and eliminate stress, worry, and depression.</p>
<p><strong>Reduce carbs and fats absorption-</strong> carbs and fats are the main culprits in weight gain. The active ingredients in SeroLean support breaking down fats and carbs for energy. The fat burner also prevents carbs and fat absorption and unpacks the stubborn fat in the body.</p>
<p><strong>Regulate blood sugar and blood pressure levels-</strong> unhealthy blood pressure and blood sugar go hand in hand with unhealthy weight. SeroLean helps with weight management, which means healthy blood sugar and blood pressure levels.</p>
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="URL PROMO[Limited Discount]: "SeroLean USA & CANADA"Official Website!</strong></a></span></h2>
<h2><strong>What Are The Natural Ingredients In SeroLean?</strong></h2>
<p><strong>SeroLean is a weight-</strong>management product that was developed using premium components. Each capsule includes a synergistic blend of the purest plant ingredients available from the plant and nature. In this part, we have explored the advantages of primary extracts used to make SeroLean products:</p>
<p><strong>Green Tea Extract</strong></p>
<p>This ingredient has been utilized for a long time to assist with weight reduction along with mental sharpness. Additionally, it facilitates post-workout recovery by managing stable blood sugar levels in the body. By using this formula appropriately, you can get boosted thermogenesis and metabolism. This allows users to burn excessive calories for the entire day.</p>
<p><strong>Extract of Saffron</strong></p>
<p>This is a rare Indian plant that increases serotonin in the brain and decreases food cravings. Usually, saffron is enriched with antioxidants and this plant is so popular because of its potential to gain both focus and libido.</p>
<p><strong>Nettle Root</strong></p>
<p>Some antioxidants can be found in stinging nettle leaves, which may be used to deal with inflammation. Advantages include a return to optimal energy levels, immunity, blood pressure, and your body's response to stress.</p>
<p><strong>5-HTP</strong></p>
<p>This ingredient helps raise serotonin levels and induces tranquility in the mind. In addition, the combination relies on the appetite-suppressing qualities of serotonin. Hence, this extract is crucial because it helps reduce weight properly.</p>
<p><strong>Vitamin B6</strong></p>
<p>It is essential for the conversion of 5-HTP to serotonin, which helps enhance and stabilize mood. In general, a breakfast dose will satisfy the appetite and help prevent snacking during the day.</p>
<p><strong>L-Theanine</strong></p>
<p>It is a kind of amino acid that has been proven to enhance mental capacity. It has been related to less stress and anxiety, deeper sleep, and a more positive outlook. Additionally, the stress in both mind and body will melt away because of its powerful amino acid.</p>
<p><strong>Ashwagandha</strong></p>
<p>Its adaptogenic properties help bring about a decline in cortisol. Usually, the stress hormone cortisol interferes with the production of serotonin, so lowering it may help decrease overeating.</p>
<p><strong>L-Carnitine</strong></p>
<p>This ingredient is essential in the process of converting digested food into beneficial energy. The skeletal muscles, the kidneys, and the liver all generate it naturally without any trouble.</p>
<p><strong>White Bean Powder</strong></p>
<p>It restricts the number of calories your body takes in from what you consume. Also, it regulates the generation of glucose-converting enzymes throughout the body. You will feel less hunger and may have greater control over diabetes or blood sugar.</p>
<h2><strong>Pros of SeroLean:</strong></h2>
<ul>
<li>SeroLean harnesses the power of all-natural ingredients to stimulate serotonin levels in your body, making it a potent tool for shedding excess fat.</li>
<li>Experience tranquil evenings and restful sleep with this calming two-part blend while supporting your body's metabolism and fat burning throughout the night.</li>
<li>Each morning, you'll awaken feeling revitalized, re-energized, and ready to kickstart your body's metabolic processes again.</li>
<li>Proper use of SeroLean, following the easy AM-PM daily schedule, can be a life-changing experience, helping you attain the attractive figure you've always desired.<br /> With SeroLean, expect results that endure over time, allowing you to lead a better, healthier, and slimmer life.</li>
<li>The nutritional complex within SeroLean contributes to elevated serotonin levels, effectively curbing hunger and emotional overeating throughout the day.</li>
<li>Beyond weight management, SeroLean supports a healthy inflammatory response, maintains optimal cholesterol levels, reduces occasional stress and anxiety, relieves occasional aches, and provides joint comfort.</li>
<li>The natural components in this SeroLean formula extend their benefits to promote improved sleep and enhance the health of your skin and hair.</li>
<li>It can also boost brain function and elevate your overall health, well-being, and quality of life.</li>
</ul>
<h2><strong>Cons of SeroLean:</strong></h2>
<p>SeroLean is exclusively accessible online, making it unavailable for purchase through offline methods.</p>
<p>Before SeroLean consumption, it is strongly recommended to carefully review the product label for instructions and important information.</p>
<p>When incorporating SeroLean supplement into your routine, it is advisable to follow the dosage recommendations provided by your healthcare provider for optimal results.</p>
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="URL PROMO: Get SeroLean at the Lowest Discounted Price Online</strong></a></span></h2>
<h2><strong>The Science Behind SeroLean</strong></h2>
<p>Dr. Posner's extensive research on weight loss and gain led him to identify serotonin imbalance as a significant obstacle to achieving sustainable weight loss. Serotonin, often referred to as the "feel-good" hormone, plays a crucial role in regulating appetite, sleep, mood, and behavior. Low levels of serotonin can lead to increased cravings for sugary and carb-filled snacks, making it challenging to maintain a healthy diet.</p>
<p>To address this issue, SeroLean incorporates an ingredient called 5-HTP, which increases serotonin synthesis in the body. This, combined with other fat-burning ingredients, helps individuals lose weight more easily with frequent usage. By boosting serotonin levels and controlling cravings, SeroLean aids in restoring balance to the brain and promoting healthy weight loss.</p>
<h2><strong>How Should You Consume SeroLean for Optimal Outcomes?</strong></h2>
<p>The manufacturer recommends taking 2 capsules from the morning bottle and 2 capsules from the evening bottle (that is, dosing twice daily). Make sure to use this product consistently for at least 3 months to get the full advantages of SeroLean.</p>
<p>In addition to using this product, a low-carb diet and regular physical exercise are suggested. Put together, these measures will kick-start your weight loss journey toward reaching and maintaining a healthy weight.</p>
<h2><strong>Pricing and Availability</strong></h2>
<p>SeroLean can only be purchased from the official website to ensure the authenticity of the product. The supplement is available in three packages:</p>
<ul>
<li><strong>One-Month Supply: $59 per bottle.</strong></li>
<li><strong>Three-Month Supply: $49 per bottle, with an extra SeroLean PM bottle included.</strong></li>
<li><strong>Six-Month Supply: $39 per bottle, with two extra SeroLean PM bottles included.</strong></li>
</ul>
<h2><strong>Conclusion</strong></h2>
<p>SeroLean is a breakthrough weight loss supplement designed to address the underlying causes of uncontrollable hunger and cravings. By optimizing neurotransmitters and hormones related to appetite regulation, SeroLean helps individuals regain control over their eating habits, leading to healthy weight loss. With its proprietary blend of natural ingredients, SeroLean offers a safe and effective solution for those seeking to achieve their weight loss goals. If you're looking for a reliable weight loss supplement backed by scientific research and positive customer reviews, SeroLean may be the right choice for you.</p>
<h2 style="text-align: center;"><span style="color: #0000ff; background-color: #ffcc00;"><a style="color: #0000ff; background-color: #ffcc00;" href="URL Details: *SeroLean* Read More Details on Official Website USA & CANADA!</strong></a></span></h2>
<h2><strong># READ MORE INFO-</strong></h2>
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL /></strong></p>
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL
<p><strong><a href="URL/URL | [
"# READ MORE INFO-</strong></h2>\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL /></strong></p>\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL"
] | [
"TAGS\n#region-us \n",
"# READ MORE INFO-</strong></h2>\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL /></strong></p>\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL"
] | [
6,
175
] | [
"passage: TAGS\n#region-us \n# READ MORE INFO-</strong></h2>\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL /></strong></p>\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL\n<p><strong><a href=\"URL/URL"
] |
2595b5217009dd6036b805c7e113dfc7b362ac6e | # Dataset Card for "dailydialogsample-jumble"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dailydialogsample-jumble | [
"region:us"
] | 2023-12-13T09:22:06+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 46955, "num_examples": 100}], "download_size": 36773, "dataset_size": 46955}} | 2023-12-13T09:22:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dailydialogsample-jumble"
More Information needed | [
"# Dataset Card for \"dailydialogsample-jumble\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dailydialogsample-jumble\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dailydialogsample-jumble\"\n\nMore Information needed"
] |
e7d239ca6af4d8f90306c4a211801908e7e96328 | # Dataset Card for "dreamsample-jumble"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/dreamsample-jumble | [
"region:us"
] | 2023-12-13T09:22:10+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 51327, "num_examples": 100}], "download_size": 38617, "dataset_size": 51327}} | 2023-12-13T09:22:13+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "dreamsample-jumble"
More Information needed | [
"# Dataset Card for \"dreamsample-jumble\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"dreamsample-jumble\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"dreamsample-jumble\"\n\nMore Information needed"
] |
baa08e93b9acb10a3e0177ec30705bccda8a6f02 | # Dataset Card for "personachatsample-jumble"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | hlt-lab/personachatsample-jumble | [
"region:us"
] | 2023-12-13T09:22:16+00:00 | {"dataset_info": {"features": [{"name": "context", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "reference", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 36304, "num_examples": 100}], "download_size": 28190, "dataset_size": 36304}} | 2023-12-13T09:22:19+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "personachatsample-jumble"
More Information needed | [
"# Dataset Card for \"personachatsample-jumble\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"personachatsample-jumble\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"personachatsample-jumble\"\n\nMore Information needed"
] |
48933099b9a1d2d6c2408917fe60ae4e30f79ddb |
The preference dataset is derived from the [stack exchange dataset](https://huggingface.co/datasets/HuggingFaceH4/stack-exchange-preferences), which contains questions and answers from the Stack Exchange Data Dump. It covers questions and answers on a wide range of topics. For this work, we used only questions and answers from the [math.stackexchange.com](https://huggingface.co/datasets/HuggingFaceH4/stack-exchange-preferences/tree/main/data/math.meta.stackexchange.com) sub-folder.
The questions are grouped with answers, each of which is assigned a score following the Anthropic paper:
```
score = log2 (1 + upvotes) rounded to the nearest integer, plus 1 if the answer was accepted by the questioner (we assign a score of −1 if the number of upvotes is negative).
```
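For illustration, here is a minimal sketch of that scoring rule in Python; the function name and signature are our own and are not taken from the processing code linked below:
```python
import math

def answer_score(upvotes: int, accepted: bool) -> int:
    """Score an answer with the rule quoted above: log2(1 + upvotes) rounded
    to the nearest integer, plus 1 if the answer was accepted by the
    questioner; a score of -1 if the upvote count is negative."""
    if upvotes < 0:
        return -1
    score = round(math.log2(1 + upvotes))
    if accepted:
        score += 1
    return score

# Example: an accepted answer with 7 upvotes scores round(log2(8)) + 1 = 4.
```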
We performed the following processing to derive the final dataset.
1) Basic pre-processing ([code](https://github.com/PraveenSH/dpo-arithmo-mistral-7B/blob/main/src/data_processing/stack_exchange_data.py)) to clean the text
2) Filter mathematical questions using a regex-based detector ([code](https://github.com/PraveenSH/dpo-arithmo-mistral-7B/blob/main/src/data_processing/stack_exchange_data.py))
3) For each question, extract 2 answers: one with the highest score and one with the lowest score. The former is used as the preferred response and the latter as the rejected response (a minimal sketch of this step is shown below).
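A minimal, hedged sketch of step 3, assuming each question arrives with a list of (answer_text, score) pairs; the names used here are illustrative and need not match the linked processing code:
```python
def build_preference_pair(question, scored_answers):
    """Turn one question and its scored answers into a single preference pair:
    the highest-scoring answer becomes the preferred response and the
    lowest-scoring answer becomes the rejected response."""
    if len(scored_answers) < 2:
        return None  # need at least two answers to form a pair
    preferred_text, _ = max(scored_answers, key=lambda pair: pair[1])
    rejected_text, _ = min(scored_answers, key=lambda pair: pair[1])
    if preferred_text == rejected_text:
        return None  # all answers tied; no meaningful preference signal
    return {"question": question, "preferred": preferred_text, "rejected": rejected_text}

# Example:
# build_preference_pair("What is 2+2?", [("4", 5), ("It is 4.", 2), ("22", -1)])
# -> {"question": "What is 2+2?", "preferred": "4", "rejected": "22"}
```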
## References
```
@online{h4stackexchange,
author = {Lambert, Nathan and Tunstall, Lewis and Rajani, Nazneen and Thrush, Tristan},
title = {HuggingFace H4 Stack Exchange Preference Dataset},
year = 2023,
url = {https://huggingface.co/datasets/HuggingFaceH4/stack-exchange-preferences},
}
``` | prhegde/preference-data-math-stack-exchange | [
"license:apache-2.0",
"region:us"
] | 2023-12-13T10:07:23+00:00 | {"license": "apache-2.0"} | 2023-12-13T10:16:15+00:00 | [] | [] | TAGS
#license-apache-2.0 #region-us
|
The preference dataset is derived from the stack exchange dataset which contains questions and answers from the Stack Overflow Data Dump. This contains questions and answers for various topics. For this work, we used only question and answers from URL sub-folder.
The questions are grouped with answers that are assigned a score corresponding to the Anthropic paper:
We performed following processing to derive the final dataset.
1) Basic pre-processing (code) to clean the text
2) Filter Mathematical question using regex based detector (code)
3) For each question, extract 2 answers - one with highest score and one with the lowest score. Former is used as Preferred response and latter is used as the rejected response
## References
| [
"## References"
] | [
"TAGS\n#license-apache-2.0 #region-us \n",
"## References"
] | [
14,
3
] | [
"passage: TAGS\n#license-apache-2.0 #region-us \n## References"
] |
132875c977232e4ef19940c074f41923dfaa302e |
# Dataset Card for Evaluation run of rishiraj/smol-3b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rishiraj/smol-3b](https://huggingface.co/rishiraj/smol-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rishiraj__smol-3b",
"harness_winogrande_5",
split="train")
```
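As a further usage note (hedged: the configuration and split names here simply follow the description above and may differ), the aggregated metrics can presumably be loaded through the "results" configuration in the same way:
```python
from datasets import load_dataset

# "results" is the aggregated-results configuration described above;
# per the card, the "train" split points at the latest results.
results = load_dataset("open-llm-leaderboard/details_rishiraj__smol-3b",
                       "results",
                       split="train")
```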
## Latest results
These are the [latest results from run 2023-12-13T10:26:39.414520](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__smol-3b/blob/main/results_2023-12-13T10-26-39.414520.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4630403817848583,
"acc_stderr": 0.03471311735087895,
"acc_norm": 0.467024043140488,
"acc_norm_stderr": 0.035458188509216476,
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.5073211090135596,
"mc2_stderr": 0.015470937650792245
},
"harness|arc:challenge|25": {
"acc": 0.42662116040955633,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.46331058020477817,
"acc_norm_stderr": 0.014572000527756998
},
"harness|hellaswag|10": {
"acc": 0.510157339175463,
"acc_stderr": 0.004988751698341138,
"acc_norm": 0.6823341963752241,
"acc_norm_stderr": 0.004646172373100999
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.031068985963122145,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.031068985963122145
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2804232804232804,
"acc_stderr": 0.023135287974325625,
"acc_norm": 0.2804232804232804,
"acc_norm_stderr": 0.023135287974325625
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.532258064516129,
"acc_stderr": 0.02838474778881333,
"acc_norm": 0.532258064516129,
"acc_norm_stderr": 0.02838474778881333
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6060606060606061,
"acc_stderr": 0.03815494308688931,
"acc_norm": 0.6060606060606061,
"acc_norm_stderr": 0.03815494308688931
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6373056994818653,
"acc_stderr": 0.03469713791704371,
"acc_norm": 0.6373056994818653,
"acc_norm_stderr": 0.03469713791704371
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694827,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694827
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097845,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097845
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.618348623853211,
"acc_stderr": 0.020828148517022596,
"acc_norm": 0.618348623853211,
"acc_norm_stderr": 0.020828148517022596
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6244725738396625,
"acc_stderr": 0.03152256243091156,
"acc_norm": 0.6244725738396625,
"acc_norm_stderr": 0.03152256243091156
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4663677130044843,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.4663677130044843,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5343511450381679,
"acc_stderr": 0.043749285605997376,
"acc_norm": 0.5343511450381679,
"acc_norm_stderr": 0.043749285605997376
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.04465869780531009,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.04465869780531009
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5705521472392638,
"acc_stderr": 0.03889066619112723,
"acc_norm": 0.5705521472392638,
"acc_norm_stderr": 0.03889066619112723
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6965811965811965,
"acc_stderr": 0.03011821010694265,
"acc_norm": 0.6965811965811965,
"acc_norm_stderr": 0.03011821010694265
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5351213282247765,
"acc_stderr": 0.017835798806290642,
"acc_norm": 0.5351213282247765,
"acc_norm_stderr": 0.017835798806290642
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49710982658959535,
"acc_stderr": 0.02691864538323901,
"acc_norm": 0.49710982658959535,
"acc_norm_stderr": 0.02691864538323901
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5294117647058824,
"acc_stderr": 0.0285803410651383,
"acc_norm": 0.5294117647058824,
"acc_norm_stderr": 0.0285803410651383
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.028394421370984538,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.028394421370984538
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.027777777777777797,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.027777777777777797
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509317,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509317
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.37353324641460234,
"acc_stderr": 0.012354994823515267,
"acc_norm": 0.37353324641460234,
"acc_norm_stderr": 0.012354994823515267
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3639705882352941,
"acc_stderr": 0.029227192460032025,
"acc_norm": 0.3639705882352941,
"acc_norm_stderr": 0.029227192460032025
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.02010258389588718,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.02010258389588718
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5673469387755102,
"acc_stderr": 0.03171752824062664,
"acc_norm": 0.5673469387755102,
"acc_norm_stderr": 0.03171752824062664
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6218905472636815,
"acc_stderr": 0.03428867848778658,
"acc_norm": 0.6218905472636815,
"acc_norm_stderr": 0.03428867848778658
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03811079669833531,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03811079669833531
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32313341493268055,
"mc1_stderr": 0.0163718362864546,
"mc2": 0.5073211090135596,
"mc2_stderr": 0.015470937650792245
},
"harness|winogrande|5": {
"acc": 0.6535122336227308,
"acc_stderr": 0.013373773411685642
},
"harness|gsm8k|5": {
"acc": 0.24639878695981804,
"acc_stderr": 0.011869498557755346
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_rishiraj__smol-3b | [
"region:us"
] | 2023-12-13T10:29:34+00:00 | {"pretty_name": "Evaluation run of rishiraj/smol-3b", "dataset_summary": "Dataset automatically created during the evaluation run of model [rishiraj/smol-3b](https://huggingface.co/rishiraj/smol-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rishiraj__smol-3b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T10:26:39.414520](https://huggingface.co/datasets/open-llm-leaderboard/details_rishiraj__smol-3b/blob/main/results_2023-12-13T10-26-39.414520.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4630403817848583,\n \"acc_stderr\": 0.03471311735087895,\n \"acc_norm\": 0.467024043140488,\n \"acc_norm_stderr\": 0.035458188509216476,\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.5073211090135596,\n \"mc2_stderr\": 0.015470937650792245\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.42662116040955633,\n \"acc_stderr\": 0.014453185592920293,\n \"acc_norm\": 0.46331058020477817,\n \"acc_norm_stderr\": 0.014572000527756998\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.510157339175463,\n \"acc_stderr\": 0.004988751698341138,\n \"acc_norm\": 0.6823341963752241,\n \"acc_norm_stderr\": 0.004646172373100999\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n 
\"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.031068985963122145,\n \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.031068985963122145\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2804232804232804,\n \"acc_stderr\": 0.023135287974325625,\n \"acc_norm\": 0.2804232804232804,\n \"acc_norm_stderr\": 0.023135287974325625\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.532258064516129,\n \"acc_stderr\": 0.02838474778881333,\n \"acc_norm\": 0.532258064516129,\n \"acc_norm_stderr\": 0.02838474778881333\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6060606060606061,\n \"acc_stderr\": 0.03815494308688931,\n \"acc_norm\": 0.6060606060606061,\n \"acc_norm_stderr\": 0.03815494308688931\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.6373056994818653,\n \"acc_stderr\": 0.03469713791704371,\n \"acc_norm\": 0.6373056994818653,\n \"acc_norm_stderr\": 0.03469713791704371\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4282051282051282,\n 
\"acc_stderr\": 0.025088301454694827,\n \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694827\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097845,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097845\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.618348623853211,\n \"acc_stderr\": 0.020828148517022596,\n \"acc_norm\": 0.618348623853211,\n \"acc_norm_stderr\": 0.020828148517022596\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.6244725738396625,\n \"acc_stderr\": 0.03152256243091156,\n \"acc_norm\": 0.6244725738396625,\n \"acc_norm_stderr\": 0.03152256243091156\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4663677130044843,\n \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.4663677130044843,\n \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5343511450381679,\n \"acc_stderr\": 0.043749285605997376,\n \"acc_norm\": 0.5343511450381679,\n \"acc_norm_stderr\": 0.043749285605997376\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6033057851239669,\n \"acc_stderr\": 0.04465869780531009,\n \"acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.04465869780531009\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5705521472392638,\n \"acc_stderr\": 0.03889066619112723,\n \"acc_norm\": 0.5705521472392638,\n \"acc_norm_stderr\": 0.03889066619112723\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6965811965811965,\n \"acc_stderr\": 0.03011821010694265,\n \"acc_norm\": 0.6965811965811965,\n \"acc_norm_stderr\": 0.03011821010694265\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5351213282247765,\n \"acc_stderr\": 0.017835798806290642,\n \"acc_norm\": 0.5351213282247765,\n \"acc_norm_stderr\": 
0.017835798806290642\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.49710982658959535,\n \"acc_stderr\": 0.02691864538323901,\n \"acc_norm\": 0.49710982658959535,\n \"acc_norm_stderr\": 0.02691864538323901\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5294117647058824,\n \"acc_stderr\": 0.0285803410651383,\n \"acc_norm\": 0.5294117647058824,\n \"acc_norm_stderr\": 0.0285803410651383\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n \"acc_stderr\": 0.028394421370984538,\n \"acc_norm\": 0.5080385852090032,\n \"acc_norm_stderr\": 0.028394421370984538\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.027777777777777797,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.027777777777777797\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509317,\n \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509317\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.37353324641460234,\n \"acc_stderr\": 0.012354994823515267,\n \"acc_norm\": 0.37353324641460234,\n \"acc_norm_stderr\": 0.012354994823515267\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.3639705882352941,\n \"acc_stderr\": 0.029227192460032025,\n \"acc_norm\": 0.3639705882352941,\n \"acc_norm_stderr\": 0.029227192460032025\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.02010258389588718,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.02010258389588718\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5673469387755102,\n \"acc_stderr\": 0.03171752824062664,\n \"acc_norm\": 0.5673469387755102,\n \"acc_norm_stderr\": 0.03171752824062664\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6218905472636815,\n \"acc_stderr\": 0.03428867848778658,\n \"acc_norm\": 0.6218905472636815,\n \"acc_norm_stderr\": 0.03428867848778658\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03811079669833531,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03811079669833531\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32313341493268055,\n \"mc1_stderr\": 0.0163718362864546,\n \"mc2\": 0.5073211090135596,\n \"mc2_stderr\": 0.015470937650792245\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6535122336227308,\n \"acc_stderr\": 0.013373773411685642\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.24639878695981804,\n \"acc_stderr\": 0.011869498557755346\n }\n}\n```", "repo_url": "https://huggingface.co/rishiraj/smol-3b", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|arc:challenge|25_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|gsm8k|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hellaswag|10_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-26-39.414520.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["**/details_harness|winogrande|5_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T10-26-39.414520.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T10_26_39.414520", "path": ["results_2023-12-13T10-26-39.414520.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T10-26-39.414520.parquet"]}]}]} | 2023-12-13T10:30:17+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rishiraj/smol-3b
Dataset automatically created during the evaluation run of model rishiraj/smol-3b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
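(The loading snippet was stripped from this processed text; below is a minimal sketch, assuming the details dataset follows the usual open-llm-leaderboard naming of `open-llm-leaderboard/details_<org>__<model>` and using the `harness_winogrande_5` configuration listed in this card's metadata.)

```python
from datasets import load_dataset

# Sketch: the dataset id is assumed from the open-llm-leaderboard naming convention;
# "harness_winogrande_5" is one of the configurations listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_rishiraj__smol-3b",
                    "harness_winogrande_5",
                    split="train")
```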
## Latest results
These are the latest results from run 2023-12-13T10:26:39.414520 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of rishiraj/smol-3b\n\n\n\nDataset automatically created during the evaluation run of model rishiraj/smol-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T10:26:39.414520(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rishiraj/smol-3b\n\n\n\nDataset automatically created during the evaluation run of model rishiraj/smol-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T10:26:39.414520(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of rishiraj/smol-3b\n\n\n\nDataset automatically created during the evaluation run of model rishiraj/smol-3b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T10:26:39.414520(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
8a6e60051c7d56e7e0760e4ff0796ae97cc00129 |
# Dataset Card for Evaluation run of rwitz2/grindin
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rwitz2/grindin](https://huggingface.co/rwitz2/grindin) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rwitz2__grindin",
"harness_winogrande_5",
split="train")
```
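
Once loaded, a quick way to get a feel for the split is to convert it to a pandas DataFrame (a minimal sketch; the exact columns depend on the chosen task configuration, so none are assumed here):

```python
# Peek at the loaded split; column names vary by task config
df = data.to_pandas()
print(df.shape)
print(df.columns.tolist())
print(df.head())
```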
## Latest results
These are the [latest results from run 2023-12-13T10:32:16.157494](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz2__grindin/blob/main/results_2023-12-13T10-32-16.157494.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543257120477071,
"acc_stderr": 0.032083375252123514,
"acc_norm": 0.6543717910762578,
"acc_norm_stderr": 0.03274372504664638,
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5933596198853605,
"mc2_stderr": 0.015591069923791908
},
"harness|arc:challenge|25": {
"acc": 0.6629692832764505,
"acc_stderr": 0.013813476652902274,
"acc_norm": 0.6988054607508533,
"acc_norm_stderr": 0.01340674176784764
},
"harness|hellaswag|10": {
"acc": 0.6933877713602868,
"acc_stderr": 0.004601446124041573,
"acc_norm": 0.8702449711212906,
"acc_norm_stderr": 0.003353469625027664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7169811320754716,
"acc_stderr": 0.027724236492700918,
"acc_norm": 0.7169811320754716,
"acc_norm_stderr": 0.027724236492700918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4417989417989418,
"acc_stderr": 0.025576257061253833,
"acc_norm": 0.4417989417989418,
"acc_norm_stderr": 0.025576257061253833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790482,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790482
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033477,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033477
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083008,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083008
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121622,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121622
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621133,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621133
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.029752389657427047,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.029752389657427047
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038915,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038915
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482706,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4394124847001224,
"mc1_stderr": 0.017374520482513707,
"mc2": 0.5933596198853605,
"mc2_stderr": 0.015591069923791908
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510429
},
"harness|gsm8k|5": {
"acc": 0.709628506444276,
"acc_stderr": 0.012503592481818957
}
}
```
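
To work with the aggregated metrics shown above rather than the per-task details, you can load the "results" configuration mentioned earlier. This is a sketch, assuming the same config/split layout as the other cards in this collection (a "results" config whose "latest" split points at the most recent run):

```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration of this details dataset.
# "results" and "latest" follow the config/split naming used by these leaderboard cards.
results = load_dataset("open-llm-leaderboard/details_rwitz2__grindin",
                       "results",
                       split="latest")
print(results[0])  # one row holding the aggregated scores for the run
```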
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_rwitz2__grindin | [
"region:us"
] | 2023-12-13T10:35:05+00:00 | {"pretty_name": "Evaluation run of rwitz2/grindin", "dataset_summary": "Dataset automatically created during the evaluation run of model [rwitz2/grindin](https://huggingface.co/rwitz2/grindin) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rwitz2__grindin\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T10:32:16.157494](https://huggingface.co/datasets/open-llm-leaderboard/details_rwitz2__grindin/blob/main/results_2023-12-13T10-32-16.157494.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543257120477071,\n \"acc_stderr\": 0.032083375252123514,\n \"acc_norm\": 0.6543717910762578,\n \"acc_norm_stderr\": 0.03274372504664638,\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5933596198853605,\n \"mc2_stderr\": 0.015591069923791908\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6629692832764505,\n \"acc_stderr\": 0.013813476652902274,\n \"acc_norm\": 0.6988054607508533,\n \"acc_norm_stderr\": 0.01340674176784764\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6933877713602868,\n \"acc_stderr\": 0.004601446124041573,\n \"acc_norm\": 0.8702449711212906,\n \"acc_norm_stderr\": 0.003353469625027664\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7169811320754716,\n \"acc_stderr\": 0.027724236492700918,\n \"acc_norm\": 0.7169811320754716,\n \"acc_norm_stderr\": 0.027724236492700918\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 
0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4417989417989418,\n \"acc_stderr\": 0.025576257061253833,\n \"acc_norm\": 0.4417989417989418,\n \"acc_norm_stderr\": 0.025576257061253833\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790482,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790482\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033477,\n \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033477\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \"acc_norm\": 
0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083008,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083008\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121622,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121622\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621133,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621133\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n 
},\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4645390070921986,\n \"acc_stderr\": 0.029752389657427047,\n \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.029752389657427047\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038915,\n \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038915\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482706,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4394124847001224,\n \"mc1_stderr\": 0.017374520482513707,\n \"mc2\": 0.5933596198853605,\n \"mc2_stderr\": 0.015591069923791908\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510429\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.709628506444276,\n \"acc_stderr\": 0.012503592481818957\n }\n}\n```", "repo_url": "https://huggingface.co/rwitz2/grindin", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": 
"[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|arc:challenge|25_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|gsm8k|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hellaswag|10_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-32-16.157494.parquet", 
"**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-32-16.157494.parquet", 
"**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-32-16.157494.parquet", 
"**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T10-32-16.157494.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-32-16.157494.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["**/details_harness|winogrande|5_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T10-32-16.157494.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T10_32_16.157494", "path": ["results_2023-12-13T10-32-16.157494.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T10-32-16.157494.parquet"]}]}]} | 2023-12-13T10:35:46+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of rwitz2/grindin
Dataset automatically created during the evaluation run of model rwitz2/grindin on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
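The snippet below mirrors the loading example recorded in this repository's metadata; `harness_winogrande_5` is just one of the 63 task configurations and can be swapped for any other configuration name listed in the repository.

```python
from datasets import load_dataset

# Load the per-sample details for one task configuration of this evaluation run.
data = load_dataset("open-llm-leaderboard/details_rwitz2__grindin",
                    "harness_winogrande_5",
                    split="train")
```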
## Latest results
These are the latest results from run 2023-12-13T10:32:16.157494 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
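A minimal sketch for pulling these aggregated numbers programmatically, assuming the "results" configuration and its "latest" split (both declared in this repository's configuration list) follow the same `load_dataset` pattern as the task configurations:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run;
# the "latest" split points to the most recent upload.
results = load_dataset("open-llm-leaderboard/details_rwitz2__grindin",
                       "results",
                       split="latest")
print(results[0])  # inspect the aggregated metrics row (illustrative)
```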
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of rwitz2/grindin\n\n\n\nDataset automatically created during the evaluation run of model rwitz2/grindin on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T10:32:16.157494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of rwitz2/grindin\n\n\n\nDataset automatically created during the evaluation run of model rwitz2/grindin on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T10:32:16.157494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
171,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of rwitz2/grindin\n\n\n\nDataset automatically created during the evaluation run of model rwitz2/grindin on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T10:32:16.157494(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
a2352d70dd29a6e14a0d8241482772d5b852e0db |
# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Llama-1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [KnutJaegersberg/Walter-Llama-1B](https://huggingface.co/KnutJaegersberg/Walter-Llama-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Walter-Llama-1B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-13T10:33:54.615691](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Llama-1B/blob/main/results_2023-12-13T10-33-54.615691.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27851461271728306,
"acc_stderr": 0.031543900001055364,
"acc_norm": 0.2811714189383077,
"acc_norm_stderr": 0.032378381556401825,
"mc1": 0.1909424724602203,
"mc1_stderr": 0.013759285842685718,
"mc2": 0.33931994336755883,
"mc2_stderr": 0.014516204773412781
},
"harness|arc:challenge|25": {
"acc": 0.31143344709897613,
"acc_stderr": 0.013532472099850945,
"acc_norm": 0.32849829351535836,
"acc_norm_stderr": 0.013724978465537371
},
"harness|hellaswag|10": {
"acc": 0.46355307707627963,
"acc_stderr": 0.004976507121076265,
"acc_norm": 0.6105357498506274,
"acc_norm_stderr": 0.004866322258335982
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03591444084196969,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03591444084196969
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2943396226415094,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.2943396226415094,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21929824561403508,
"acc_stderr": 0.038924311065187546,
"acc_norm": 0.21929824561403508,
"acc_norm_stderr": 0.038924311065187546
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.038932596106046734,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.038932596106046734
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.267741935483871,
"acc_stderr": 0.025189006660212378,
"acc_norm": 0.267741935483871,
"acc_norm_stderr": 0.025189006660212378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.29064039408866993,
"acc_stderr": 0.03194740072265541,
"acc_norm": 0.29064039408866993,
"acc_norm_stderr": 0.03194740072265541
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070644,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070644
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.35233160621761656,
"acc_stderr": 0.03447478286414358,
"acc_norm": 0.35233160621761656,
"acc_norm_stderr": 0.03447478286414358
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712173,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712173
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3067226890756303,
"acc_stderr": 0.029953823891887048,
"acc_norm": 0.3067226890756303,
"acc_norm_stderr": 0.029953823891887048
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.30458715596330277,
"acc_stderr": 0.01973229942035405,
"acc_norm": 0.30458715596330277,
"acc_norm_stderr": 0.01973229942035405
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.03228210387037894,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.03228210387037894
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23766816143497757,
"acc_stderr": 0.028568079464714274,
"acc_norm": 0.23766816143497757,
"acc_norm_stderr": 0.028568079464714274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2892561983471074,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.2892561983471074,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.03642914578292404,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.03642914578292404
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.16071428571428573,
"acc_stderr": 0.03485946096475741,
"acc_norm": 0.16071428571428573,
"acc_norm_stderr": 0.03485946096475741
},
"harness|hendrycksTest-management|5": {
"acc": 0.33980582524271846,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.33980582524271846,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.029614323690456648,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.029614323690456648
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.17,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.17,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398698,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398698
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961459,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961459
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.02633661346904663,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.02633661346904663
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25617283950617287,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.25617283950617287,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.02577001564429039,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.02577001564429039
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2588005215123859,
"acc_stderr": 0.011186109046564613,
"acc_norm": 0.2588005215123859,
"acc_norm_stderr": 0.011186109046564613
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877753,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877753
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.01724238582877961,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.01724238582877961
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.263681592039801,
"acc_stderr": 0.03115715086935554,
"acc_norm": 0.263681592039801,
"acc_norm_stderr": 0.03115715086935554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071856,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071856
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.1909424724602203,
"mc1_stderr": 0.013759285842685718,
"mc2": 0.33931994336755883,
"mc2_stderr": 0.014516204773412781
},
"harness|winogrande|5": {
"acc": 0.5643251775848461,
"acc_stderr": 0.013935709739615712
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_KnutJaegersberg__Walter-Llama-1B | [
"region:us"
] | 2023-12-13T10:36:10+00:00 | {"pretty_name": "Evaluation run of KnutJaegersberg/Walter-Llama-1B", "dataset_summary": "Dataset automatically created during the evaluation run of model [KnutJaegersberg/Walter-Llama-1B](https://huggingface.co/KnutJaegersberg/Walter-Llama-1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__Walter-Llama-1B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T10:33:54.615691](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__Walter-Llama-1B/blob/main/results_2023-12-13T10-33-54.615691.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27851461271728306,\n \"acc_stderr\": 0.031543900001055364,\n \"acc_norm\": 0.2811714189383077,\n \"acc_norm_stderr\": 0.032378381556401825,\n \"mc1\": 0.1909424724602203,\n \"mc1_stderr\": 0.013759285842685718,\n \"mc2\": 0.33931994336755883,\n \"mc2_stderr\": 0.014516204773412781\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.31143344709897613,\n \"acc_stderr\": 0.013532472099850945,\n \"acc_norm\": 0.32849829351535836,\n \"acc_norm_stderr\": 0.013724978465537371\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46355307707627963,\n \"acc_stderr\": 0.004976507121076265,\n \"acc_norm\": 0.6105357498506274,\n \"acc_norm_stderr\": 0.004866322258335982\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03591444084196969,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03591444084196969\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2943396226415094,\n \"acc_stderr\": 0.028049186315695248,\n \"acc_norm\": 0.2943396226415094,\n \"acc_norm_stderr\": 0.028049186315695248\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.2708333333333333,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.35,\n 
\"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21929824561403508,\n \"acc_stderr\": 0.038924311065187546,\n \"acc_norm\": 0.21929824561403508,\n \"acc_norm_stderr\": 0.038924311065187546\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.038932596106046734,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.038932596106046734\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.267741935483871,\n \"acc_stderr\": 0.025189006660212378,\n \"acc_norm\": 0.267741935483871,\n \"acc_norm_stderr\": 0.025189006660212378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.29064039408866993,\n \"acc_stderr\": 0.03194740072265541,\n \"acc_norm\": 0.29064039408866993,\n \"acc_norm_stderr\": 0.03194740072265541\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070644,\n \"acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070644\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.35233160621761656,\n \"acc_stderr\": 0.03447478286414358,\n \"acc_norm\": 0.35233160621761656,\n \"acc_norm_stderr\": 0.03447478286414358\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.023946724741563976,\n \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.023946724741563976\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712173,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712173\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3067226890756303,\n \"acc_stderr\": 0.029953823891887048,\n \"acc_norm\": 0.3067226890756303,\n \"acc_norm_stderr\": 0.029953823891887048\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.30458715596330277,\n \"acc_stderr\": 0.01973229942035405,\n \"acc_norm\": 0.30458715596330277,\n \"acc_norm_stderr\": 0.01973229942035405\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.03228210387037894,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.03228210387037894\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23766816143497757,\n \"acc_stderr\": 0.028568079464714274,\n \"acc_norm\": 0.23766816143497757,\n \"acc_norm_stderr\": 0.028568079464714274\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847834,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847834\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2892561983471074,\n \"acc_stderr\": 0.041391127276354626,\n \"acc_norm\": 0.2892561983471074,\n \"acc_norm_stderr\": 0.041391127276354626\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.03642914578292404,\n \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.03642914578292404\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.16071428571428573,\n \"acc_stderr\": 0.03485946096475741,\n \"acc_norm\": 0.16071428571428573,\n \"acc_norm_stderr\": 0.03485946096475741\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.33980582524271846,\n \"acc_stderr\": 0.046897659372781335,\n \"acc_norm\": 0.33980582524271846,\n \"acc_norm_stderr\": 0.046897659372781335\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n \"acc_stderr\": 0.029614323690456648,\n \"acc_norm\": 0.2863247863247863,\n \"acc_norm_stderr\": 0.029614323690456648\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.17,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.2388250319284802,\n \"acc_stderr\": 0.015246803197398698,\n \"acc_norm\": 0.2388250319284802,\n \"acc_norm_stderr\": 0.015246803197398698\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069363,\n \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069363\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n \"acc_stderr\": 0.014310999547961459,\n \"acc_norm\": 0.24134078212290502,\n \"acc_norm_stderr\": 0.014310999547961459\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.02633661346904663,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.02633661346904663\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.2379421221864952,\n \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.25617283950617287,\n \"acc_stderr\": 0.0242885336377261,\n \"acc_norm\": 0.25617283950617287,\n \"acc_norm_stderr\": 0.0242885336377261\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.24822695035460993,\n \"acc_stderr\": 0.02577001564429039,\n \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.02577001564429039\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2588005215123859,\n \"acc_stderr\": 0.011186109046564613,\n \"acc_norm\": 0.2588005215123859,\n \"acc_norm_stderr\": 0.011186109046564613\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877753,\n \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877753\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.238562091503268,\n \"acc_stderr\": 0.01724238582877961,\n \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.01724238582877961\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.263681592039801,\n \"acc_stderr\": 0.03115715086935554,\n \"acc_norm\": 0.263681592039801,\n \"acc_norm_stderr\": 0.03115715086935554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.03410646614071856,\n \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.03410646614071856\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.032180937956023566,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.032180937956023566\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.1909424724602203,\n \"mc1_stderr\": 0.013759285842685718,\n \"mc2\": 0.33931994336755883,\n \"mc2_stderr\": 0.014516204773412781\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5643251775848461,\n \"acc_stderr\": 0.013935709739615712\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n 
}\n}\n```", "repo_url": "https://huggingface.co/KnutJaegersberg/Walter-Llama-1B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|arc:challenge|25_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|gsm8k|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hellaswag|10_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-33-54.615691.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-33-54.615691.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-33-54.615691.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T10-33-54.615691.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-33-54.615691.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T10_33_54.615691", "path": ["**/details_harness|winogrande|5_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T10-33-54.615691.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T10_33_54.615691", "path": ["results_2023-12-13T10-33-54.615691.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T10-33-54.615691.parquet"]}]}]} | 2023-12-13T10:36:54+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Llama-1B
Dataset automatically created during the evaluation run of model KnutJaegersberg/Walter-Llama-1B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
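A minimal sketch of that call (the repository id below is an assumption inferred from the `details_<org>__<model>` naming pattern these leaderboard datasets follow; the configuration name matches the ones listed in this card's metadata):

```python
from datasets import load_dataset

# Repo id inferred from the usual details_<org>__<model> pattern --
# verify it against the actual dataset before relying on it.
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__Walter-Llama-1B",
	"harness_winogrande_5",   # any per-task configuration listed in the metadata works here
	split="train")            # "train" always points to the latest results
```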
## Latest results
These are the latest results from run 2023-12-13T10:33:54.615691 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Llama-1B\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Walter-Llama-1B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T10:33:54.615691(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Llama-1B\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Walter-Llama-1B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T10:33:54.615691(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of KnutJaegersberg/Walter-Llama-1B\n\n\n\nDataset automatically created during the evaluation run of model KnutJaegersberg/Walter-Llama-1B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T10:33:54.615691(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
cca6ac79945a4c65286cf58490e813d3d517542f | # Dataset Card for "safety_en_14k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | nguyenthanhdo/safety_en_14k | [
"region:us"
] | 2023-12-13T10:49:32+00:00 | {"dataset_info": {"features": [{"name": "prompt", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "type", "dtype": "string"}, {"name": "split", "dtype": "string"}, {"name": "translated", "dtype": "bool"}], "splits": [{"name": "train", "num_bytes": 10930380, "num_examples": 14187}], "download_size": 5673826, "dataset_size": 10930380}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T10:49:37+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "safety_en_14k"
More Information needed | [
"# Dataset Card for \"safety_en_14k\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"safety_en_14k\"\n\nMore Information needed"
] | [
6,
17
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"safety_en_14k\"\n\nMore Information needed"
] |
26941bec0f119a60314a4da0df27a4a729f7adc8 |
# Dataset Card for Evaluation run of Sao10K/Ana-v1-m7
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Sao10K/Ana-v1-m7](https://huggingface.co/Sao10K/Ana-v1-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Sao10K__Ana-v1-m7",
"harness_winogrande_5",
split="train")
```
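
Since the per-task configurations only hold the raw per-sample details, the aggregated scores live in the separate "results" configuration mentioned above. A minimal sketch for reading them (the "results" config and "latest" split names are taken from this card's metadata; treat the snippet as illustrative rather than canonical):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_Sao10K__Ana-v1-m7",
	"results",
	split="latest")
print(results[0])  # first (and typically only) row: aggregated metrics for the latest run
```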
## Latest results
These are the [latest results from run 2023-12-13T11:00:42.385261](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Ana-v1-m7/blob/main/results_2023-12-13T11-00-42.385261.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6450876976103553,
"acc_stderr": 0.032232788125298194,
"acc_norm": 0.6484944592637535,
"acc_norm_stderr": 0.032869268033759086,
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.5503162010975955,
"mc2_stderr": 0.01574450133768535
},
"harness|arc:challenge|25": {
"acc": 0.6407849829351536,
"acc_stderr": 0.014020224155839152,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.013697432466693247
},
"harness|hellaswag|10": {
"acc": 0.6762597092212707,
"acc_stderr": 0.0046694598919176915,
"acc_norm": 0.859788886675961,
"acc_norm_stderr": 0.0034649633793799287
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.049598599663841815,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.049598599663841815
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188712,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188712
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6230769230769231,
"acc_stderr": 0.024570975364225995,
"acc_norm": 0.6230769230769231,
"acc_norm_stderr": 0.024570975364225995
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.034063153607115065,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.034063153607115065
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5357142857142857,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.5357142857142857,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8288633461047255,
"acc_stderr": 0.013468201614066302,
"acc_norm": 0.8288633461047255,
"acc_norm_stderr": 0.013468201614066302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.028661996202335303,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.028661996202335303
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291286,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291286
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.03061111655743253,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.03061111655743253
},
"harness|truthfulqa:mc|0": {
"mc1": 0.39657282741738065,
"mc1_stderr": 0.017124930942023515,
"mc2": 0.5503162010975955,
"mc2_stderr": 0.01574450133768535
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.011631268360607778
},
"harness|gsm8k|5": {
"acc": 0.5253980288097043,
"acc_stderr": 0.01375470508911231
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Sao10K__Ana-v1-m7 | [
"region:us"
] | 2023-12-13T11:03:33+00:00 | {"pretty_name": "Evaluation run of Sao10K/Ana-v1-m7", "dataset_summary": "Dataset automatically created during the evaluation run of model [Sao10K/Ana-v1-m7](https://huggingface.co/Sao10K/Ana-v1-m7) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Sao10K__Ana-v1-m7\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T11:00:42.385261](https://huggingface.co/datasets/open-llm-leaderboard/details_Sao10K__Ana-v1-m7/blob/main/results_2023-12-13T11-00-42.385261.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6450876976103553,\n \"acc_stderr\": 0.032232788125298194,\n \"acc_norm\": 0.6484944592637535,\n \"acc_norm_stderr\": 0.032869268033759086,\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.5503162010975955,\n \"mc2_stderr\": 0.01574450133768535\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6407849829351536,\n \"acc_stderr\": 0.014020224155839152,\n \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.013697432466693247\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6762597092212707,\n \"acc_stderr\": 0.0046694598919176915,\n \"acc_norm\": 0.859788886675961,\n \"acc_norm_stderr\": 0.0034649633793799287\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n 
\"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.049598599663841815,\n \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.049598599663841815\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41534391534391535,\n \"acc_stderr\": 0.0253795249107784,\n \"acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.0253795249107784\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n \"acc_stderr\": 0.023025899617188712,\n \"acc_norm\": 0.7935483870967742,\n \"acc_norm_stderr\": 0.023025899617188712\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6230769230769231,\n 
\"acc_stderr\": 0.024570975364225995,\n \"acc_norm\": 0.6230769230769231,\n \"acc_norm_stderr\": 0.024570975364225995\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.034063153607115065,\n \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.034063153607115065\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5357142857142857,\n \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.5357142857142857,\n \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8288633461047255,\n \"acc_stderr\": 0.013468201614066302,\n \"acc_norm\": 0.8288633461047255,\n 
\"acc_norm_stderr\": 0.013468201614066302\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.028661996202335303,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.028661996202335303\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291286,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291286\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.03061111655743253,\n \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.03061111655743253\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.39657282741738065,\n \"mc1_stderr\": 0.017124930942023515,\n \"mc2\": 0.5503162010975955,\n \"mc2_stderr\": 0.01574450133768535\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.011631268360607778\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5253980288097043,\n \"acc_stderr\": 0.01375470508911231\n }\n}\n```", "repo_url": "https://huggingface.co/Sao10K/Ana-v1-m7", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-00-42.385261.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["**/details_harness|winogrande|5_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T11-00-42.385261.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T11_00_42.385261", "path": ["results_2023-12-13T11-00-42.385261.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T11-00-42.385261.parquet"]}]}]} | 2023-12-13T11:04:14+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Sao10K/Ana-v1-m7
Dataset automatically created during the evaluation run of model Sao10K/Ana-v1-m7 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T11:00:42.385261 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Sao10K/Ana-v1-m7\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Ana-v1-m7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:00:42.385261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Sao10K/Ana-v1-m7\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Ana-v1-m7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:00:42.385261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Sao10K/Ana-v1-m7\n\n\n\nDataset automatically created during the evaluation run of model Sao10K/Ana-v1-m7 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T11:00:42.385261(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
fc670f3778ed358912e624dd6f927a6181715ade |
# Dataset Card for Evaluation run of PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp](https://huggingface.co/PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
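As a small illustrative sketch (an addition, not part of the original card): the available configurations can be listed programmatically before loading anything. The repository id is the one used throughout this card, and `get_dataset_config_names` is the standard helper from the `datasets` library.

```python
from datasets import get_dataset_config_names

# Lists every configuration in this details repository, e.g.
# "harness_arc_challenge_25", "harness_winogrande_5", "results", ...
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp"
)
print(len(configs), "configurations available")
print(configs[:5])
```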
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp",
"harness_winogrande_5",
split="train")
```
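A hedged follow-up sketch (also an addition to the original card): the loaded split behaves like any `datasets.Dataset`. Column names vary between harness versions and are not documented here, so the snippet prints the schema rather than assuming specific field names.

```python
from datasets import load_dataset

# Re-load the winogrande details split from the snippet above; any other
# configuration name from this card works the same way.
data = load_dataset(
    "open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp",
    "harness_winogrande_5",
    split="train",
)

# The per-example schema is not documented in this card, so inspect it first.
print(data.column_names)
print(data.num_rows, "examples")

df = data.to_pandas()  # convenient for ad-hoc filtering and aggregation
print(df.head())
```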
## Latest results
These are the [latest results from run 2023-12-13T11:03:01.883125](https://huggingface.co/datasets/open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp/blob/main/results_2023-12-13T11-03-01.883125.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6393052547076683,
"acc_stderr": 0.03226129096267341,
"acc_norm": 0.640236665344228,
"acc_norm_stderr": 0.032916099826004463,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5393109292355184,
"mc2_stderr": 0.015020765741482643
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.6663822525597269,
"acc_norm_stderr": 0.013778687054176538
},
"harness|hellaswag|10": {
"acc": 0.6587333200557658,
"acc_stderr": 0.004731657228906991,
"acc_norm": 0.8497311292571201,
"acc_norm_stderr": 0.0035660447773274194
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138204,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138204
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.02424378399406216,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.02424378399406216
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8365261813537676,
"acc_stderr": 0.013223928616741612,
"acc_norm": 0.8365261813537676,
"acc_norm_stderr": 0.013223928616741612
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3418994413407821,
"acc_stderr": 0.015864506461604644,
"acc_norm": 0.3418994413407821,
"acc_norm_stderr": 0.015864506461604644
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277738,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277738
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.019139943748487036,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.019139943748487036
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5393109292355184,
"mc2_stderr": 0.015020765741482643
},
"harness|winogrande|5": {
"acc": 0.7971586424625099,
"acc_stderr": 0.011301439925936648
},
"harness|gsm8k|5": {
"acc": 0.6618650492797574,
"acc_stderr": 0.013030829145172226
}
}
```
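As an illustrative sketch (not part of the original card), the aggregated numbers above can also be retrieved through the "results" configuration; the config and split names come from this card's file listing, while the exact parquet schema is not documented here, so the snippet only inspects it.

```python
from datasets import load_dataset

# "latest" always points to the most recent run
# (here 2023-12-13T11:03:01.883125).
results = load_dataset(
    "open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp",
    "results",
    split="latest",
)

# The schema is undocumented in this card; inspect it before relying on field names.
print(results.column_names)
print(results[0])
```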
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp | [
"region:us"
] | 2023-12-13T11:05:51+00:00 | {"pretty_name": "Evaluation run of PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp", "dataset_summary": "Dataset automatically created during the evaluation run of model [PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp](https://huggingface.co/PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T11:03:01.883125](https://huggingface.co/datasets/open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp/blob/main/results_2023-12-13T11-03-01.883125.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6393052547076683,\n \"acc_stderr\": 0.03226129096267341,\n \"acc_norm\": 0.640236665344228,\n \"acc_norm_stderr\": 0.032916099826004463,\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5393109292355184,\n \"mc2_stderr\": 0.015020765741482643\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n \"acc_norm\": 0.6663822525597269,\n \"acc_norm_stderr\": 0.013778687054176538\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6587333200557658,\n \"acc_stderr\": 0.004731657228906991,\n \"acc_norm\": 0.8497311292571201,\n \"acc_norm_stderr\": 0.0035660447773274194\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.0358687928008034\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138204,\n \"acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138204\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n \"acc_norm\": 0.8808290155440415,\n 
\"acc_norm_stderr\": 0.023381935348121437\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.02424378399406216,\n \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.02424378399406216\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8365261813537676,\n \"acc_stderr\": 0.013223928616741612,\n \"acc_norm\": 0.8365261813537676,\n \"acc_norm_stderr\": 0.013223928616741612\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3418994413407821,\n \"acc_stderr\": 0.015864506461604644,\n \"acc_norm\": 0.3418994413407821,\n \"acc_norm_stderr\": 0.015864506461604644\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n \"acc_stderr\": 0.012715404841277738,\n \"acc_norm\": 0.45371577574967403,\n \"acc_norm_stderr\": 0.012715404841277738\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.019139943748487036,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.019139943748487036\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5393109292355184,\n \"mc2_stderr\": 0.015020765741482643\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7971586424625099,\n \"acc_stderr\": 0.011301439925936648\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6618650492797574,\n 
\"acc_stderr\": 0.013030829145172226\n }\n}\n```", "repo_url": "https://huggingface.co/PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-03-01.883125.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-03-01.883125.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-03-01.883125.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-03-01.883125.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-03-01.883125.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T11_03_01.883125", "path": ["**/details_harness|winogrande|5_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T11-03-01.883125.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T11_03_01.883125", "path": ["results_2023-12-13T11-03-01.883125.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T11-03-01.883125.parquet"]}]}]} | 2023-12-13T11:06:34+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp
Dataset automatically created during the evaluation run of model PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
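A minimal sketch is shown below; the dataset id is assumed to follow the leaderboard's usual details_<org>__<model> naming, and "harness_winogrande_5" is one of the configurations listed in this card's metadata, so adjust both to your needs:

```python
from datasets import load_dataset

# Load one evaluated task ("configuration"); the "latest" split defined in the
# configuration list above points to the most recent evaluation run.
data = load_dataset(
    "open-llm-leaderboard/details_PistachioAlt__Synatra-MCS-7B-v0.3-RP-Slerp",
    "harness_winogrande_5",
    split="latest",
)
print(data[0])
```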
## Latest results
These are the latest results from run 2023-12-13T11:03:01.883125 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp\n\n\n\nDataset automatically created during the evaluation run of model PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:03:01.883125(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp\n\n\n\nDataset automatically created during the evaluation run of model PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:03:01.883125(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
205,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp\n\n\n\nDataset automatically created during the evaluation run of model PistachioAlt/Synatra-MCS-7B-v0.3-RP-Slerp on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T11:03:01.883125(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
88673ab295e8af4a5ba99b03c7a7b2ab2c2d31e5 |
# All India Bar Exam (AIBE) Dataset
<!-- Provide a quick summary of the dataset. -->
Welcome to the All India Bar Exam (AIBE) Dataset! This dataset is designed to facilitate the evaluation of Natural Language Processing (NLP) models, particularly in the field of legal studies. It contains a collection of questions from various AIBE exams, along with multiple-choice options and correct answers.
## Purpose
The primary purpose of this dataset is to serve as a valuable resource for researchers, developers, and practitioners working on legal NLP and large language models. By using this dataset, you can evaluate the performance of your models in understanding and processing legal questions, which can be crucial for applications such as legal document analysis, legal chatbots, and more.
### Content
The dataset includes questions from the past 12 years of AIBE exams, ranging from AIBE 4 to AIBE 16.
* A collection of questions from different All India Bar Exams.
* Multiple-choice options are associated with each question.
* The correct answer for each question.
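A minimal sketch of loading and inspecting a record, based on the feature schema in this card's metadata (exam_name, exam_number, question_number, question_text, an options struct keyed "A"–"D", and answer; a single "train" split with 1,157 questions):

```python
from datasets import load_dataset

# Load the single "train" split and look at one question.
aibe = load_dataset("opennyaiorg/aibe_dataset", split="train")

example = aibe[0]
print(example["exam_name"], "question", example["question_number"])
print(example["question_text"])
for letter, option in example["options"].items():  # keys "A" through "D"
    print(f"{letter}. {option}")
print("Answer:", example["answer"])
```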
### Intended Use
This dataset is intended for evaluation purposes only. Please refrain from altering the dataset to maintain its integrity and ensure fair evaluations. Users are encouraged to use the dataset responsibly, respecting legal and ethical guidelines.
<!--
### Citation
If you use this dataset in your work or research, please cite it appropriately to acknowledge the source:
```bibtex
[Author Name(s), TO BE UPDATED]
```
-->
### Usage Guidelines
* Evaluation Only: Use this dataset solely for evaluating the performance of NLP models on legal questions.
* No Alterations: Do not alter the dataset. Any modifications may compromise the integrity of the evaluation.
* Attribution: If you use this dataset in your research or projects, kindly attribute it as specified in the citation section.
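A typical evaluation loop might look like the sketch below. `predict_option` is a hypothetical stand-in for whatever model you are evaluating (it is not part of this dataset), and the comparison assumes the answer field stores the option letter:

```python
from datasets import load_dataset

def predict_option(question_text, options):
    """Hypothetical model call: should return one of "A", "B", "C", "D"."""
    raise NotImplementedError  # plug in your own model here

aibe = load_dataset("opennyaiorg/aibe_dataset", split="train")

correct = 0
for row in aibe:
    prediction = predict_option(row["question_text"], row["options"])
    if prediction == row["answer"]:  # assumes answers are option letters
        correct += 1

print(f"Accuracy: {correct / len(aibe):.2%}")
```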
### Disclaimer
The dataset is provided "as is" without any warranty. The authors and contributors are not responsible for any errors or omissions in the dataset. Use it at your own discretion.
### Feedback
We welcome feedback, suggestions, or any issues you may encounter with the dataset. Feel free to contribute to its improvement by providing feedback on the Hugging Face platform or by contacting the dataset maintainers.
Thank you for using the All India Bar Exam (AIBE) Dataset! We hope it proves to be a valuable asset for your research and development in the field of legal NLP. | opennyaiorg/aibe_dataset | [
"task_categories:question-answering",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-nd-4.0",
"legal",
"region:us"
] | 2023-12-13T11:05:52+00:00 | {"language": ["en"], "license": "cc-by-nd-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["question-answering"], "pretty_name": "AIBE Dataset", "tags": ["legal"], "dataset_info": {"features": [{"name": "exam_name", "dtype": "string"}, {"name": "exam_number", "dtype": "string"}, {"name": "question_number", "dtype": "int64"}, {"name": "question_text", "dtype": "string"}, {"name": "options", "struct": [{"name": "A", "dtype": "string"}, {"name": "B", "dtype": "string"}, {"name": "C", "dtype": "string"}, {"name": "D", "dtype": "string"}]}, {"name": "answer", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 297975, "num_examples": 1157}], "download_size": 184347, "dataset_size": 297975}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-19T06:19:37+00:00 | [] | [
"en"
] | TAGS
#task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nd-4.0 #legal #region-us
|
# All India Bar Exam (AIBE) Dataset
Welcome to the All India Bar Exam (AIBE) Dataset! This dataset is designed to facilitate the evaluation of Natural Language Processing (NLP) models, particularly in the field of legal studies. It contains a collection of questions from various AIBE exams, along with multiple-choice options and correct answers.
## Purpose
The primary purpose of this dataset is to serve as a valuable resource for researchers, developers, and practitioners working on legal NLP and large language models. By using this dataset, you can evaluate the performance of your models in understanding and processing legal questions, which can be crucial for applications such as legal document analysis, legal chatbots, and more.
### Content
The dataset includes questions from past 12 years AIBE exams ranging from AIBE 4 to AIBE 16.
* A collection of questions from different All India Bar Exams.
* Multiple-choice options are associated with each question.
* The correct answer for each question.
### Intended Use
This dataset is intended for evaluation purposes only. Please refrain from altering the dataset to maintain its integrity and ensure fair evaluations. Users are encouraged to use the dataset responsibly, respecting legal and ethical guidelines.
### Usage Guidelines
* Evaluation Only: Use this dataset solely for evaluating the performance of NLP models on legal questions.
* No Alterations: Do not alter the dataset. Any modifications may compromise the integrity of the evaluation.
* Attribution: If you use this dataset in your research or projects, kindly attribute it as specified in the citation section.
### Disclaimer
The dataset is provided "as is" without any warranty. The authors and contributors are not responsible for any errors or omissions in the dataset. Use it at your own discretion.
### Feedback
We welcome feedback, suggestions, or any issues you may encounter with the dataset. Feel free to contribute to its improvement by providing feedback on the Hugging Face platform or by contacting the dataset maintainers.
Thank you for using the All India Bar Exam (AIBE) Dataset! We hope it proves to be a valuable asset for your research and development in the field of legal NLP. | [
"# All India Bar Exam (AIBE) Dataset\n\n\n\nWelcome to the All India Bar Exam (AIBE) Dataset! This dataset is designed to facilitate the evaluation of Natural Language Processing (NLP) models, particularly in the field of legal studies. It contains a collection of questions from various AIBE exams, along with multiple-choice options and correct answers.",
"## Purpose\n\nThe primary purpose of this dataset is to serve as a valuable resource for researchers, developers, and practitioners working on legal NLP and large language models. By using this dataset, you can evaluate the performance of your models in understanding and processing legal questions, which can be crucial for applications such as legal document analysis, legal chatbots, and more.",
"### Content\nThe dataset includes questions from past 12 years AIBE exams ranging from AIBE 4 to AIBE 16.\n\n* A collection of questions from different All India Bar Exams.\n* Multiple-choice options are associated with each question.\n* The correct answer for each question.",
"### Intended Use\nThis dataset is intended for evaluation purposes only. Please refrain from altering the dataset to maintain its integrity and ensure fair evaluations. Users are encouraged to use the dataset responsibly, respecting legal and ethical guidelines.",
"### Usage Guidelines\n* Evaluation Only: Use this dataset solely for evaluating the performance of NLP models on legal questions.\n* No Alterations: Do not alter the dataset. Any modifications may compromise the integrity of the evaluation.\n* Attribution: If you use this dataset in your research or projects, kindly attribute it as specified in the citation section.",
"### Disclaimer\nThe dataset is provided \"as is\" without any warranty. The authors and contributors are not responsible for any errors or omissions in the dataset. Use it at your own discretion.",
"### Feedback\nWe welcome feedback, suggestions, or any issues you may encounter with the dataset. Feel free to contribute to its improvement by providing feedback on the Hugging Face platform or by contacting the dataset maintainers.\n\nThank you for using the All India Bar Exam (AIBE) Dataset! We hope it proves to be a valuable asset for your research and development in the field of legal NLP."
] | [
"TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nd-4.0 #legal #region-us \n",
"# All India Bar Exam (AIBE) Dataset\n\n\n\nWelcome to the All India Bar Exam (AIBE) Dataset! This dataset is designed to facilitate the evaluation of Natural Language Processing (NLP) models, particularly in the field of legal studies. It contains a collection of questions from various AIBE exams, along with multiple-choice options and correct answers.",
"## Purpose\n\nThe primary purpose of this dataset is to serve as a valuable resource for researchers, developers, and practitioners working on legal NLP and large language models. By using this dataset, you can evaluate the performance of your models in understanding and processing legal questions, which can be crucial for applications such as legal document analysis, legal chatbots, and more.",
"### Content\nThe dataset includes questions from past 12 years AIBE exams ranging from AIBE 4 to AIBE 16.\n\n* A collection of questions from different All India Bar Exams.\n* Multiple-choice options are associated with each question.\n* The correct answer for each question.",
"### Intended Use\nThis dataset is intended for evaluation purposes only. Please refrain from altering the dataset to maintain its integrity and ensure fair evaluations. Users are encouraged to use the dataset responsibly, respecting legal and ethical guidelines.",
"### Usage Guidelines\n* Evaluation Only: Use this dataset solely for evaluating the performance of NLP models on legal questions.\n* No Alterations: Do not alter the dataset. Any modifications may compromise the integrity of the evaluation.\n* Attribution: If you use this dataset in your research or projects, kindly attribute it as specified in the citation section.",
"### Disclaimer\nThe dataset is provided \"as is\" without any warranty. The authors and contributors are not responsible for any errors or omissions in the dataset. Use it at your own discretion.",
"### Feedback\nWe welcome feedback, suggestions, or any issues you may encounter with the dataset. Feel free to contribute to its improvement by providing feedback on the Hugging Face platform or by contacting the dataset maintainers.\n\nThank you for using the All India Bar Exam (AIBE) Dataset! We hope it proves to be a valuable asset for your research and development in the field of legal NLP."
] | [
47,
80,
82,
60,
60,
84,
46,
86
] | [
"passage: TAGS\n#task_categories-question-answering #size_categories-1K<n<10K #language-English #license-cc-by-nd-4.0 #legal #region-us \n# All India Bar Exam (AIBE) Dataset\n\n\n\nWelcome to the All India Bar Exam (AIBE) Dataset! This dataset is designed to facilitate the evaluation of Natural Language Processing (NLP) models, particularly in the field of legal studies. It contains a collection of questions from various AIBE exams, along with multiple-choice options and correct answers.## Purpose\n\nThe primary purpose of this dataset is to serve as a valuable resource for researchers, developers, and practitioners working on legal NLP and large language models. By using this dataset, you can evaluate the performance of your models in understanding and processing legal questions, which can be crucial for applications such as legal document analysis, legal chatbots, and more.### Content\nThe dataset includes questions from past 12 years AIBE exams ranging from AIBE 4 to AIBE 16.\n\n* A collection of questions from different All India Bar Exams.\n* Multiple-choice options are associated with each question.\n* The correct answer for each question.### Intended Use\nThis dataset is intended for evaluation purposes only. Please refrain from altering the dataset to maintain its integrity and ensure fair evaluations. Users are encouraged to use the dataset responsibly, respecting legal and ethical guidelines.### Usage Guidelines\n* Evaluation Only: Use this dataset solely for evaluating the performance of NLP models on legal questions.\n* No Alterations: Do not alter the dataset. Any modifications may compromise the integrity of the evaluation.\n* Attribution: If you use this dataset in your research or projects, kindly attribute it as specified in the citation section.### Disclaimer\nThe dataset is provided \"as is\" without any warranty. The authors and contributors are not responsible for any errors or omissions in the dataset. Use it at your own discretion."
] |
e6034b92c85e6061bc508d2011f2328d95282b8e |
# Dataset Card for Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [uukuguy/speechless-mistral-six-in-one-7b-orth-1.0](https://huggingface.co/uukuguy/speechless-mistral-six-in-one-7b-orth-1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0",
"harness_winogrande_5",
split="train")
```
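The aggregated metrics mentioned above live in the "results" configuration; a minimal sketch of loading them is shown below (the configuration and split names follow the leaderboard's conventions, as in the configuration lists elsewhere in this document, so adjust if your copy differs):

```python
from datasets import load_dataset

# "results" stores the aggregated metrics of the run; the "latest" split
# points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0",
    "results",
    split="latest",
)
print(results[0])
```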
## Latest results
These are the [latest results from run 2023-12-13T11:13:22.485134](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0/blob/main/results_2023-12-13T11-13-22.485134.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23196194129343728,
"acc_stderr": 0.029934654752561563,
"acc_norm": 0.2314240573187148,
"acc_norm_stderr": 0.03071122006512167,
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132861,
"acc_norm": 0.22696245733788395,
"acc_norm_stderr": 0.012240491536132861
},
"harness|hellaswag|10": {
"acc": 0.2504481179047998,
"acc_stderr": 0.004323856300539177,
"acc_norm": 0.2504481179047998,
"acc_norm_stderr": 0.004323856300539177
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23754789272030652,
"acc_stderr": 0.015218733046150193,
"acc_norm": 0.23754789272030652,
"acc_norm_stderr": 0.015218733046150193
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 1.0,
"mc1_stderr": 0.0,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076911
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0 | [
"region:us"
] | 2023-12-13T11:16:12+00:00 | {"pretty_name": "Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [uukuguy/speechless-mistral-six-in-one-7b-orth-1.0](https://huggingface.co/uukuguy/speechless-mistral-six-in-one-7b-orth-1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T11:13:22.485134](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0/blob/main/results_2023-12-13T11-13-22.485134.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23196194129343728,\n \"acc_stderr\": 0.029934654752561563,\n \"acc_norm\": 0.2314240573187148,\n \"acc_norm_stderr\": 0.03071122006512167,\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132861,\n \"acc_norm\": 0.22696245733788395,\n \"acc_norm_stderr\": 0.012240491536132861\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2504481179047998,\n \"acc_stderr\": 0.004323856300539177,\n \"acc_norm\": 0.2504481179047998,\n \"acc_norm_stderr\": 0.004323856300539177\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n \"acc_norm\": 0.19689119170984457,\n 
\"acc_norm_stderr\": 0.028697873971860664\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.23754789272030652,\n \"acc_stderr\": 0.015218733046150193,\n \"acc_norm\": 0.23754789272030652,\n \"acc_norm_stderr\": 0.015218733046150193\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 1.0,\n \"mc1_stderr\": 0.0,\n \"mc2\": NaN,\n \"mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076911\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": 
"https://huggingface.co/uukuguy/speechless-mistral-six-in-one-7b-orth-1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T11_13_22.485134", "path": ["**/details_harness|winogrande|5_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T11-13-22.485134.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T11_13_22.485134", "path": ["results_2023-12-13T11-13-22.485134.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T11-13-22.485134.parquet"]}]}]} | 2023-12-13T11:16:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0
Dataset automatically created during the evaluation run of model uukuguy/speechless-mistral-six-in-one-7b-orth-1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
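The snippet below is the loading example given in this card's metadata; `harness_winogrande_5` is just one of the 63 configurations, and any other configuration name from the metadata can be substituted.

```python
from datasets import load_dataset

# Per-sample details for one evaluated task (here: Winogrande, 5-shot).
data = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0",
    "harness_winogrande_5",
    split="train",
)
```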
## Latest results
These are the latest results from run 2023-12-13T11:13:22.485134 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the per-task results and in the "latest" split for each eval):
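The aggregated metrics themselves are stored in the `results` configuration, whose `latest` split always points at the most recent results file. A minimal sketch, using the same `load_dataset` call as above with the config and split names taken from this card's metadata:

```python
from datasets import load_dataset

# Aggregated scores for the whole run; "latest" tracks the newest results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_uukuguy__speechless-mistral-six-in-one-7b-orth-1.0",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated metrics for this run
```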
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-mistral-six-in-one-7b-orth-1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:13:22.485134(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-mistral-six-in-one-7b-orth-1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:13:22.485134(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
211,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of uukuguy/speechless-mistral-six-in-one-7b-orth-1.0\n\n\n\nDataset automatically created during the evaluation run of model uukuguy/speechless-mistral-six-in-one-7b-orth-1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T11:13:22.485134(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
a12bc4d7eee81725b65037f2857a05bf60b77aad |
Data for the study were collected from the 5 largest and most active communities of the VKontakte social network, dedicated to local city news and events in a town with a population of less than 100 thousand in the Nizhny Novgorod region of Russia. The number of subscribers in these social media communities varies from 12 to 43 thousand.
A total of 662881 comments were collected.
The Trager coefficient was calculated by dividing the number of verbs by the number of adjectives in each comment. These values were averaged for each day, thus forming the final dataset. | Maxstan/trager_coef_by_date | [
"license:cc-by-4.0",
"doi:10.57967/hf/1475",
"region:us"
] | 2023-12-13T11:29:14+00:00 | {"license": "cc-by-4.0"} | 2023-12-13T11:48:22+00:00 | [] | [] | TAGS
#license-cc-by-4.0 #doi-10.57967/hf/1475 #region-us
|
Data for the study were collected from the 5 largest and most active communities of the VKontakte social network, dedicated to local city news and events in a town with a population of less than 100 thousand in the Nizhny Novgorod region of Russia. The number of subscribers in these social media communities varies from 12 to 43 thousand.
A total of 662881 comments were collected.
The Trager coefficient was calculated by dividing the number of verbs by the number of adjectives in each comment. These values were averaged for each day, thus forming the final dataset. | [] | [
"TAGS\n#license-cc-by-4.0 #doi-10.57967/hf/1475 #region-us \n"
] | [
27
] | [
"passage: TAGS\n#license-cc-by-4.0 #doi-10.57967/hf/1475 #region-us \n"
] |
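The per-day Trager coefficient described in the card above (verbs divided by adjectives for each comment, then averaged within each day) can be sketched in a few lines of Python. This is a minimal, non-authoritative illustration: per-comment part-of-speech counts are assumed to already be available, skipping zero-adjective comments is an assumption rather than a documented choice, and the dates and counts in the usage example are made up.

```python
from collections import defaultdict
from statistics import mean

def trager_by_date(records):
    """records: iterable of (date, n_verbs, n_adjectives) tuples, one per comment.

    Returns a dict mapping each date to the mean Trager coefficient
    (verbs / adjectives) over that day's comments."""
    per_day = defaultdict(list)
    for date, n_verbs, n_adjectives in records:
        if n_adjectives == 0:
            # Assumption: comments with no adjectives are skipped; the card
            # does not say how such comments were handled in the study.
            continue
        per_day[date].append(n_verbs / n_adjectives)
    return {date: mean(ratios) for date, ratios in per_day.items()}

# Tiny usage example with made-up counts:
example = [("2023-01-01", 3, 2), ("2023-01-01", 1, 1), ("2023-01-02", 4, 2)]
print(trager_by_date(example))  # {'2023-01-01': 1.25, '2023-01-02': 2.0}
```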
7f3a0f30e1ac23074dd7b888e1eda1011175d74d |
# Dataset of yato (Arknights)
This is the dataset of yato (Arknights), containing 90 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).

This is a WebUI that contains crawlers and other tools: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 90 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 227 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 243 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 90 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 90 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 90 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 227 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 227 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 175 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 243 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 243 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
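
As a convenience sketch (not part of the original card), any of the archives listed above can be fetched with `huggingface_hub`; the repository id and the example filename `dataset-384x512.zip` are taken from this card, so substitute any other archive name from the table:

```python
from huggingface_hub import hf_hub_download

# Download one of the packaged archives listed in the table above.
# Swap "dataset-384x512.zip" for any other filename from the table.
zip_path = hf_hub_download(
    repo_id="AppleHarem/yato_arknights",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)
print(zip_path)  # local cache path of the downloaded archive
```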
| AppleHarem/yato_arknights | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-12-13T11:29:57+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-13T16:18:22+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of yato (Arknights)
===========================
This is the dataset of yato (Arknights), containing 90 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
This is a WebUI that contains crawlers and other tools: (LittleAppleWebUI)
| [] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] | [
44
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
741b88cc6e59760c3816a9642d69ae3a1a0cca71 |
# Dataset Card for Evaluation run of athirdpath/NSFW_DPO_Noromaid-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [athirdpath/NSFW_DPO_Noromaid-7b](https://huggingface.co/athirdpath/NSFW_DPO_Noromaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_athirdpath__NSFW_DPO_Noromaid-7b",
"harness_winogrande_5",
split="train")
```
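
The same pattern works for any other configuration in this repository. As a small illustrative variation (the `harness_gsm8k_5` configuration and the `latest` split are both named in this card's configuration list), one might load the GSM8K details like this:

```python
from datasets import load_dataset

# Illustrative variation: load the GSM8K details for this model using the
# "latest" split, which points at the most recent evaluation run.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_athirdpath__NSFW_DPO_Noromaid-7b",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details)
```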
## Latest results
These are the [latest results from run 2023-12-13T11:49:19.833498](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__NSFW_DPO_Noromaid-7b/blob/main/results_2023-12-13T11-49-19.833498.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6307501582947606,
"acc_stderr": 0.03259733263931875,
"acc_norm": 0.6368306168876743,
"acc_norm_stderr": 0.03325789211362751,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44993493459502887,
"mc2_stderr": 0.014562901933937895
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6262798634812287,
"acc_norm_stderr": 0.014137708601759086
},
"harness|hellaswag|10": {
"acc": 0.6461860187213703,
"acc_stderr": 0.004771751187407022,
"acc_norm": 0.8449512049392551,
"acc_norm_stderr": 0.0036121146706989786
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.025305906241590632,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.025305906241590632
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.423841059602649,
"acc_stderr": 0.04034846678603397,
"acc_norm": 0.423841059602649,
"acc_norm_stderr": 0.04034846678603397
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399327,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399327
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636863,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636863
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22569832402234638,
"acc_stderr": 0.013981395058455052,
"acc_norm": 0.22569832402234638,
"acc_norm_stderr": 0.013981395058455052
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493274,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493274
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6580882352941176,
"acc_stderr": 0.02881472242225418,
"acc_norm": 0.6580882352941176,
"acc_norm_stderr": 0.02881472242225418
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.01920660684882536,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.01920660684882536
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.44993493459502887,
"mc2_stderr": 0.014562901933937895
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.35860500379075055,
"acc_stderr": 0.013210317364134031
}
}
```
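
As a purely illustrative sketch (not part of the evaluation tooling), a dictionary shaped like the snippet above can be summarised in a few lines of Python; the two values below are copied from the snippet and the remaining tasks are elided:

```python
# Sketch: average the per-task "hendrycksTest" (MMLU) accuracies from a results
# dictionary shaped like the snippet above. Only two tasks are filled in here;
# in practice the full dictionary from the results JSON would be used.
latest_results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.5925925925925926},
    # ... remaining tasks elided ...
}

mmlu = [
    metrics["acc_norm"]
    for task, metrics in latest_results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"{len(mmlu)} MMLU tasks, mean acc_norm = {sum(mmlu) / len(mmlu):.4f}")
```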
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_athirdpath__NSFW_DPO_Noromaid-7b | [
"region:us"
] | 2023-12-13T11:52:10+00:00 | {"pretty_name": "Evaluation run of athirdpath/NSFW_DPO_Noromaid-7b", "dataset_summary": "Dataset automatically created during the evaluation run of model [athirdpath/NSFW_DPO_Noromaid-7b](https://huggingface.co/athirdpath/NSFW_DPO_Noromaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_athirdpath__NSFW_DPO_Noromaid-7b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T11:49:19.833498](https://huggingface.co/datasets/open-llm-leaderboard/details_athirdpath__NSFW_DPO_Noromaid-7b/blob/main/results_2023-12-13T11-49-19.833498.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6307501582947606,\n \"acc_stderr\": 0.03259733263931875,\n \"acc_norm\": 0.6368306168876743,\n \"acc_norm_stderr\": 0.03325789211362751,\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44993493459502887,\n \"mc2_stderr\": 0.014562901933937895\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n \"acc_norm\": 0.6262798634812287,\n \"acc_norm_stderr\": 0.014137708601759086\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6461860187213703,\n \"acc_stderr\": 0.004771751187407022,\n \"acc_norm\": 0.8449512049392551,\n \"acc_norm_stderr\": 0.0036121146706989786\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.47,\n 
\"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.025305906241590632,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.025305906241590632\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.02446861524147892,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147892\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.03086868260412163,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.03086868260412163\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.423841059602649,\n \"acc_stderr\": 0.04034846678603397,\n \"acc_norm\": 0.423841059602649,\n \"acc_norm_stderr\": 0.04034846678603397\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399327,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399327\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728742,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728742\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7982120051085568,\n \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n \"acc_norm_stderr\": 0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22569832402234638,\n \"acc_stderr\": 0.013981395058455052,\n \"acc_norm\": 0.22569832402234638,\n \"acc_norm_stderr\": 0.013981395058455052\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n \"acc_stderr\": 0.012673969883493274,\n \"acc_norm\": 0.438722294654498,\n \"acc_norm_stderr\": 0.012673969883493274\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6580882352941176,\n \"acc_stderr\": 0.02881472242225418,\n \"acc_norm\": 0.6580882352941176,\n \"acc_norm_stderr\": 0.02881472242225418\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.01920660684882536,\n \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.01920660684882536\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.44993493459502887,\n \"mc2_stderr\": 0.014562901933937895\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.35860500379075055,\n \"acc_stderr\": 0.013210317364134031\n 
}\n}\n```", "repo_url": "https://huggingface.co/athirdpath/NSFW_DPO_Noromaid-7b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-49-19.833498.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-49-19.833498.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-49-19.833498.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-49-19.833498.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-49-19.833498.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T11_49_19.833498", "path": ["**/details_harness|winogrande|5_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T11-49-19.833498.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T11_49_19.833498", "path": ["results_2023-12-13T11-49-19.833498.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T11-49-19.833498.parquet"]}]}]} | 2023-12-13T11:52:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of athirdpath/NSFW_DPO_Noromaid-7b
Dataset automatically created during the evaluation run of model athirdpath/NSFW_DPO_Noromaid-7b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T11:49:19.833498 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of athirdpath/NSFW_DPO_Noromaid-7b\n\n\n\nDataset automatically created during the evaluation run of model athirdpath/NSFW_DPO_Noromaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:49:19.833498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of athirdpath/NSFW_DPO_Noromaid-7b\n\n\n\nDataset automatically created during the evaluation run of model athirdpath/NSFW_DPO_Noromaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:49:19.833498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of athirdpath/NSFW_DPO_Noromaid-7b\n\n\n\nDataset automatically created during the evaluation run of model athirdpath/NSFW_DPO_Noromaid-7b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T11:49:19.833498(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
1ae45175ecba7bb5ec379af33c0d2ac4a6645763 |
# Dataset Card for Evaluation run of beberik/Nyxene-v3-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [beberik/Nyxene-v3-11B](https://huggingface.co/beberik/Nyxene-v3-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v3-11B",
"harness_winogrande_5",
split="train")
```
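
The same pattern works for the aggregated "results" configuration, which holds the summary metrics for the run — a minimal sketch; the config name and the "latest" split below are taken from this card's own file listing:

```python
from datasets import load_dataset

# Aggregated metrics for this run; the "latest" split points to the most recent results
results = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v3-11B",
	"results",
	split="latest")
```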
## Latest results
These are the [latest results from run 2023-12-13T11:54:50.915290](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v3-11B/blob/main/results_2023-12-13T11-54-50.915290.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504771774813103,
"acc_stderr": 0.03215464413567304,
"acc_norm": 0.6518364388315545,
"acc_norm_stderr": 0.03280459937231609,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6091044754563715,
"mc2_stderr": 0.015269448129178369
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880536,
"acc_norm": 0.6962457337883959,
"acc_norm_stderr": 0.013438909184778768
},
"harness|hellaswag|10": {
"acc": 0.6650069707229636,
"acc_stderr": 0.004710234188047365,
"acc_norm": 0.8533160724955188,
"acc_norm_stderr": 0.0035306750148923196
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.03196758697835363,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.03196758697835363
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899092,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899092
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4134078212290503,
"acc_stderr": 0.016469814928406164,
"acc_norm": 0.4134078212290503,
"acc_norm_stderr": 0.016469814928406164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7132352941176471,
"acc_stderr": 0.027472274473233815,
"acc_norm": 0.7132352941176471,
"acc_norm_stderr": 0.027472274473233815
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.02484575321230604,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.02484575321230604
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6091044754563715,
"mc2_stderr": 0.015269448129178369
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.011201862744487054
},
"harness|gsm8k|5": {
"acc": 0.6353297952994693,
"acc_stderr": 0.013258428375662247
}
}
```
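
To inspect the per-example records behind any of these numbers, a single task configuration can be loaded and converted to a DataFrame — a minimal sketch assuming the standard `datasets` API; the config name `harness_gsm8k_5` comes from this card's file listing, and the exact column names depend on the harness output:

```python
from datasets import load_dataset

# Per-example details for the 5-shot GSM8K eval of this run
details = load_dataset("open-llm-leaderboard/details_beberik__Nyxene-v3-11B",
	"harness_gsm8k_5",
	split="latest")

# Quick look at the available fields (column names depend on the harness output)
df = details.to_pandas()
print(df.columns.tolist())
```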
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_beberik__Nyxene-v3-11B | [
"region:us"
] | 2023-12-13T11:57:45+00:00 | {"pretty_name": "Evaluation run of beberik/Nyxene-v3-11B", "dataset_summary": "Dataset automatically created during the evaluation run of model [beberik/Nyxene-v3-11B](https://huggingface.co/beberik/Nyxene-v3-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_beberik__Nyxene-v3-11B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T11:54:50.915290](https://huggingface.co/datasets/open-llm-leaderboard/details_beberik__Nyxene-v3-11B/blob/main/results_2023-12-13T11-54-50.915290.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504771774813103,\n \"acc_stderr\": 0.03215464413567304,\n \"acc_norm\": 0.6518364388315545,\n \"acc_norm_stderr\": 0.03280459937231609,\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6091044754563715,\n \"mc2_stderr\": 0.015269448129178369\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880536,\n \"acc_norm\": 0.6962457337883959,\n \"acc_norm_stderr\": 0.013438909184778768\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6650069707229636,\n \"acc_stderr\": 0.004710234188047365,\n \"acc_norm\": 0.8533160724955188,\n \"acc_norm_stderr\": 0.0035306750148923196\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 
0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.03196758697835363,\n \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.03196758697835363\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 
0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822584,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822584\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n \"acc_stderr\": 0.02023714900899092,\n \"acc_norm\": 0.8931623931623932,\n \"acc_norm_stderr\": 0.02023714900899092\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 
0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4134078212290503,\n \"acc_stderr\": 0.016469814928406164,\n \"acc_norm\": 0.4134078212290503,\n \"acc_norm_stderr\": 0.016469814928406164\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7132352941176471,\n \"acc_stderr\": 0.027472274473233815,\n \"acc_norm\": 0.7132352941176471,\n \"acc_norm_stderr\": 0.027472274473233815\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n \"acc_stderr\": 0.02484575321230604,\n \"acc_norm\": 0.8557213930348259,\n \"acc_norm_stderr\": 0.02484575321230604\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.536144578313253,\n \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6091044754563715,\n \"mc2_stderr\": 0.015269448129178369\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.011201862744487054\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6353297952994693,\n \"acc_stderr\": 0.013258428375662247\n }\n}\n```", "repo_url": "https://huggingface.co/beberik/Nyxene-v3-11B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T11-54-50.915290.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["**/details_harness|winogrande|5_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T11-54-50.915290.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T11_54_50.915290", "path": ["results_2023-12-13T11-54-50.915290.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T11-54-50.915290.parquet"]}]}]} | 2023-12-13T11:58:27+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of beberik/Nyxene-v3-11B
Dataset automatically created during the evaluation run of model beberik/Nyxene-v3-11B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
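A minimal sketch, assuming the repository follows the leaderboard's usual `details_<org>__<model>` naming convention and using one of the configurations listed in this repo:

```python
from datasets import load_dataset

# Repository id is assumed from the Open LLM Leaderboard naming convention
data = load_dataset(
    "open-llm-leaderboard/details_beberik__Nyxene-v3-11B",
    "harness_winogrande_5",  # any of the 63 configurations can be used here
    split="train",           # "train" always points to the latest results
)
print(data[0])
```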
## Latest results
These are the latest results from run 2023-12-13T11:54:50.915290 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of beberik/Nyxene-v3-11B\n\n\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-v3-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:54:50.915290(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of beberik/Nyxene-v3-11B\n\n\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-v3-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T11:54:50.915290(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of beberik/Nyxene-v3-11B\n\n\n\nDataset automatically created during the evaluation run of model beberik/Nyxene-v3-11B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T11:54:50.915290(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
cb38bdd2c35015917c0cf3edc45748c4ffd5c8e2 | # Dataset Card for "shikomori-asr-augmented"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | nairaxo/shikomori-asr-augmented | [
"region:us"
] | 2023-12-13T12:10:19+00:00 | {"dataset_info": {"features": [{"name": "audio", "dtype": "audio"}, {"name": "transcription", "dtype": "string"}, {"name": "duration", "dtype": "float64"}, {"name": "dialect", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 858447006.686, "num_examples": 4926}], "download_size": 988067627, "dataset_size": 858447006.686}} | 2023-12-13T12:10:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "shikomori-asr-augmented"
More Information needed | [
"# Dataset Card for \"shikomori-asr-augmented\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"shikomori-asr-augmented\"\n\nMore Information needed"
] | [
6,
20
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"shikomori-asr-augmented\"\n\nMore Information needed"
] |
7230879f58aac5b8272368ae3776f0220a7f2cc8 |
# Dataset of ranger (Azur Lane)
This is the dataset of ranger (Azur Lane), containing 45 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains the crawlers and other tools: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 45 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 120 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 132 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 45 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 45 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 45 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 120 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 120 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 109 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 132 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 132 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
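If you prefer to fetch one of these archives programmatically, here is a minimal sketch with `huggingface_hub` (assuming the zip files sit at the root of this dataset repository, as the relative links above suggest):

```python
import zipfile

from huggingface_hub import hf_hub_download

# Download one packaged variant from this dataset repository
archive = hf_hub_download(
    repo_id="AppleHarem/ranger_azurlane",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)

# Unpack the aligned images into a local folder
with zipfile.ZipFile(archive) as zf:
    zf.extractall("ranger_azurlane_384x512")
```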
| AppleHarem/ranger_azurlane | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-12-13T12:13:41+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-13T17:00:55+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of ranger (Azur Lane)
=============================
This is the dataset of ranger (Azur Lane), containing 45 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team (huggingface organization).
This is a WebUI that contains the crawlers and other tools: (LittleAppleWebUI)
| [] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] | [
44
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
529cc1d6a644c6182687858e57494b7d379056ae |
```
DatasetDict({
train: Dataset({
features: ['metadata', 'chosen_rating', 'rejected_rating', 'prompt', 'chosen', 'rejected'],
num_rows: 2393
})
test: Dataset({
features: ['metadata', 'chosen_rating', 'rejected_rating', 'prompt', 'chosen', 'rejected'],
num_rows: 25
})
})
```
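A minimal sketch of loading the splits above and reading one preference pair (column names follow the features listed in this card):

```python
from datasets import load_dataset

ds = load_dataset("kira/math-dpo")  # returns the train/test DatasetDict shown above

row = ds["train"][0]
print(row["prompt"])
print("chosen  :", row["chosen_rating"], row["chosen"][:100])
print("rejected:", row["rejected_rating"], row["rejected"][:100])
```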
| kira/math-dpo | [
"region:us"
] | 2023-12-13T12:21:38+00:00 | {"dataset_info": {"features": [{"name": "metadata", "dtype": "string", "id": "metadata"}, {"name": "chosen_rating", "dtype": "float64"}, {"name": "rejected_rating", "dtype": "float64"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "dtype": "string"}, {"name": "rejected", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 6976299.638544251, "num_examples": 2393}, {"name": "test", "num_bytes": 72882.36145574856, "num_examples": 25}], "download_size": 3135711, "dataset_size": 7049182.0}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}]} | 2023-12-13T12:25:28+00:00 | [] | [] | TAGS
#region-us
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
|
c8a6355436d54457e8ec61bcb368bd01d1eb025e | # Dataset Card for Dataset Name
Includes images of different Indian cities.
## Dataset Details
Each city has 2500 images
### Dataset Description
This dataset contains 2500 images per city for popular Indian cities. The cities included are Ahmedabad, Mumbai, Delhi, and Kolkata, plus the state of Kerala.
- **Curated by:** Divax Shah and Team
### Dataset Sources
Google
- **Demo:** [here](https://location-classification-of-indian-cities.streamlit.app/)
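A minimal sketch of loading the images (the split name and column names are assumptions, since the card does not document them):

```python
from datasets import load_dataset

# Repository id from this card; the split name is an assumption
ds = load_dataset("diabolic6045/Images-of-Top-Indian-Cities", split="train")

print(ds)            # inspect the features, e.g. the image column and the city label
print(ds[0].keys())
```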
| diabolic6045/Images-of-Top-Indian-Cities | [
"task_categories:image-classification",
"size_categories:10K<n<100K",
"license:apache-2.0",
"India",
"Cities",
"Ahmedabad",
"Delhi",
"Kolkata",
"Mumbai",
"Kerala",
"region:us"
] | 2023-12-13T12:32:12+00:00 | {"license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["image-classification"], "tags": ["India", "Cities", "Ahmedabad", "Delhi", "Kolkata", "Mumbai", "Kerala"]} | 2023-12-13T12:54:44+00:00 | [] | [] | TAGS
#task_categories-image-classification #size_categories-10K<n<100K #license-apache-2.0 #India #Cities #Ahmedabad #Delhi #Kolkata #Mumbai #Kerala #region-us
| # Dataset Card for Dataset Name
Includes Images for different Indian Cities.
## Dataset Details
Each city has 2500 images
### Dataset Description
This dataset contains 2500 images per city for popular Indian cities. The cities included are Ahmedabad, Mumbai, Delhi, and Kolkata, plus the state of Kerala.
- Curated by: Divax Shah and Team
### Dataset Sources
Google
- Demo: here
| [
"# Dataset Card for Dataset Name\nIncludes Images for different Indian Cities.",
"## Dataset Details\nEach city has 2500 images",
"### Dataset Description\nThis dataset contains 2500 images per Cities of popular indian Cities, City included are Ahmendabad, Mumbai, Delhi, Koklakta and A state Kerala.\n\n- Curated by: Divax Shah and Team",
"### Dataset Sources\nGoogle\n- Demo: here"
] | [
"TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-apache-2.0 #India #Cities #Ahmedabad #Delhi #Kolkata #Mumbai #Kerala #region-us \n",
"# Dataset Card for Dataset Name\nIncludes Images for different Indian Cities.",
"## Dataset Details\nEach city has 2500 images",
"### Dataset Description\nThis dataset contains 2500 images per Cities of popular indian Cities, City included are Ahmendabad, Mumbai, Delhi, Koklakta and A state Kerala.\n\n- Curated by: Divax Shah and Team",
"### Dataset Sources\nGoogle\n- Demo: here"
] | [
60,
18,
9,
50,
11
] | [
"passage: TAGS\n#task_categories-image-classification #size_categories-10K<n<100K #license-apache-2.0 #India #Cities #Ahmedabad #Delhi #Kolkata #Mumbai #Kerala #region-us \n# Dataset Card for Dataset Name\nIncludes Images for different Indian Cities.## Dataset Details\nEach city has 2500 images### Dataset Description\nThis dataset contains 2500 images per Cities of popular indian Cities, City included are Ahmendabad, Mumbai, Delhi, Koklakta and A state Kerala.\n\n- Curated by: Divax Shah and Team### Dataset Sources\nGoogle\n- Demo: here"
] |
f48988cf1072477813f95fbe746e5c6ff1fd8541 |
# Dataset Card for Evaluation run of teilomillet/MiniMerlin-3b-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [teilomillet/MiniMerlin-3b-v0.1](https://huggingface.co/teilomillet/MiniMerlin-3b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-13T12:30:09.463717](https://huggingface.co/datasets/open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1/blob/main/results_2023-12-13T12-30-09.463717.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.42829802423091123,
"acc_stderr": 0.034419009383078604,
"acc_norm": 0.4345596062931712,
"acc_norm_stderr": 0.035301959046270974,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133022,
"mc2": 0.49647374974901654,
"mc2_stderr": 0.015915065186614973
},
"harness|arc:challenge|25": {
"acc": 0.38139931740614336,
"acc_stderr": 0.014194389086685261,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.014356399418009131
},
"harness|hellaswag|10": {
"acc": 0.4343756223859789,
"acc_stderr": 0.004946617138983514,
"acc_norm": 0.5406293567018522,
"acc_norm_stderr": 0.004973280417705513
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.042849586397533994,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.042849586397533994
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.040463368839782486,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.040463368839782486
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47924528301886793,
"acc_stderr": 0.030746349975723463,
"acc_norm": 0.47924528301886793,
"acc_norm_stderr": 0.030746349975723463
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3872832369942196,
"acc_stderr": 0.037143259063020656,
"acc_norm": 0.3872832369942196,
"acc_norm_stderr": 0.037143259063020656
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604673,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604673
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5161290322580645,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.5161290322580645,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5212121212121212,
"acc_stderr": 0.03900828913737302,
"acc_norm": 0.5212121212121212,
"acc_norm_stderr": 0.03900828913737302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5252525252525253,
"acc_stderr": 0.03557806245087314,
"acc_norm": 0.5252525252525253,
"acc_norm_stderr": 0.03557806245087314
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5647668393782384,
"acc_stderr": 0.035780381650085846,
"acc_norm": 0.5647668393782384,
"acc_norm_stderr": 0.035780381650085846
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.382051282051282,
"acc_stderr": 0.024635549163908227,
"acc_norm": 0.382051282051282,
"acc_norm_stderr": 0.024635549163908227
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267613,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267613
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.031041941304059274,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.031041941304059274
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5614678899082569,
"acc_stderr": 0.021274713073954572,
"acc_norm": 0.5614678899082569,
"acc_norm_stderr": 0.021274713073954572
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015475,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015475
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5907172995780591,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.5907172995780591,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4663677130044843,
"acc_stderr": 0.033481800170603065,
"acc_norm": 0.4663677130044843,
"acc_norm_stderr": 0.033481800170603065
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5,
"acc_stderr": 0.04833682445228318,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04833682445228318
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.5825242718446602,
"acc_stderr": 0.048828405482122375,
"acc_norm": 0.5825242718446602,
"acc_norm_stderr": 0.048828405482122375
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6709401709401709,
"acc_stderr": 0.03078232157768817,
"acc_norm": 0.6709401709401709,
"acc_norm_stderr": 0.03078232157768817
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.49936143039591313,
"acc_stderr": 0.01787994891443168,
"acc_norm": 0.49936143039591313,
"acc_norm_stderr": 0.01787994891443168
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4653179190751445,
"acc_stderr": 0.026854257928258893,
"acc_norm": 0.4653179190751445,
"acc_norm_stderr": 0.026854257928258893
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808862,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808862
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.028614624752805434,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.028614624752805434
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4855305466237942,
"acc_stderr": 0.02838619808417768,
"acc_norm": 0.4855305466237942,
"acc_norm_stderr": 0.02838619808417768
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.45987654320987653,
"acc_stderr": 0.027731022753539274,
"acc_norm": 0.45987654320987653,
"acc_norm_stderr": 0.027731022753539274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3475177304964539,
"acc_stderr": 0.028406627809590947,
"acc_norm": 0.3475177304964539,
"acc_norm_stderr": 0.028406627809590947
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3533246414602347,
"acc_stderr": 0.012208408211082428,
"acc_norm": 0.3533246414602347,
"acc_norm_stderr": 0.012208408211082428
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.02714627193662517,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.02714627193662517
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4133986928104575,
"acc_stderr": 0.01992211568278667,
"acc_norm": 0.4133986928104575,
"acc_norm_stderr": 0.01992211568278667
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5346938775510204,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.5346938775510204,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.43373493975903615,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.43373493975903615,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5029239766081871,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.5029239766081871,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133022,
"mc2": 0.49647374974901654,
"mc2_stderr": 0.015915065186614973
},
"harness|winogrande|5": {
"acc": 0.6053670086819258,
"acc_stderr": 0.013736915172371888
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.003195747075480817
}
}
```
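To pull these aggregated numbers programmatically, here is a minimal sketch using the "results" configuration mentioned above (the "latest" split name is assumed from the pattern used by the other configurations in this repo):

```python
from datasets import load_dataset

results = load_dataset(
    "open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1",
    "results",
    split="latest",  # assumed split name pointing to the most recent run
)
print(results[0])
```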
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1 | [
"region:us"
] | 2023-12-13T12:33:05+00:00 | {"pretty_name": "Evaluation run of teilomillet/MiniMerlin-3b-v0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [teilomillet/MiniMerlin-3b-v0.1](https://huggingface.co/teilomillet/MiniMerlin-3b-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T12:30:09.463717](https://huggingface.co/datasets/open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1/blob/main/results_2023-12-13T12-30-09.463717.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.42829802423091123,\n \"acc_stderr\": 0.034419009383078604,\n \"acc_norm\": 0.4345596062931712,\n \"acc_norm_stderr\": 0.035301959046270974,\n \"mc1\": 0.3023255813953488,\n \"mc1_stderr\": 0.016077509266133022,\n \"mc2\": 0.49647374974901654,\n \"mc2_stderr\": 0.015915065186614973\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.38139931740614336,\n \"acc_stderr\": 0.014194389086685261,\n \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.014356399418009131\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4343756223859789,\n \"acc_stderr\": 0.004946617138983514,\n \"acc_norm\": 0.5406293567018522,\n \"acc_norm_stderr\": 0.004973280417705513\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.042849586397533994,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.042849586397533994\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.040463368839782486,\n \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.040463368839782486\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.47924528301886793,\n \"acc_stderr\": 0.030746349975723463,\n \"acc_norm\": 0.47924528301886793,\n \"acc_norm_stderr\": 0.030746349975723463\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3872832369942196,\n \"acc_stderr\": 0.037143259063020656,\n \"acc_norm\": 0.3872832369942196,\n \"acc_norm_stderr\": 0.037143259063020656\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231,\n \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n \"acc_stderr\": 0.03893259610604673,\n \"acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.03893259610604673\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5161290322580645,\n \"acc_stderr\": 0.028429203176724555,\n \"acc_norm\": 0.5161290322580645,\n \"acc_norm_stderr\": 0.028429203176724555\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5212121212121212,\n \"acc_stderr\": 0.03900828913737302,\n \"acc_norm\": 0.5212121212121212,\n \"acc_norm_stderr\": 0.03900828913737302\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5252525252525253,\n \"acc_stderr\": 0.03557806245087314,\n \"acc_norm\": 0.5252525252525253,\n \"acc_norm_stderr\": 0.03557806245087314\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5647668393782384,\n \"acc_stderr\": 0.035780381650085846,\n \"acc_norm\": 0.5647668393782384,\n \"acc_norm_stderr\": 0.035780381650085846\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.382051282051282,\n \"acc_stderr\": 0.024635549163908227,\n \"acc_norm\": 0.382051282051282,\n \"acc_norm_stderr\": 0.024635549163908227\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267613,\n \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267613\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.031041941304059274,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.031041941304059274\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.5614678899082569,\n \"acc_stderr\": 0.021274713073954572,\n \"acc_norm\": 0.5614678899082569,\n \"acc_norm_stderr\": 0.021274713073954572\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015475,\n \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015475\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5907172995780591,\n \"acc_stderr\": 0.032007041833595914,\n \"acc_norm\": 0.5907172995780591,\n \"acc_norm_stderr\": 0.032007041833595914\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4663677130044843,\n \"acc_stderr\": 0.033481800170603065,\n \"acc_norm\": 0.4663677130044843,\n \"acc_norm_stderr\": 0.033481800170603065\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.04833682445228318,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.04833682445228318\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.5825242718446602,\n \"acc_stderr\": 0.048828405482122375,\n \"acc_norm\": 0.5825242718446602,\n \"acc_norm_stderr\": 0.048828405482122375\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6709401709401709,\n \"acc_stderr\": 0.03078232157768817,\n \"acc_norm\": 0.6709401709401709,\n \"acc_norm_stderr\": 0.03078232157768817\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.49936143039591313,\n \"acc_stderr\": 
0.01787994891443168,\n \"acc_norm\": 0.49936143039591313,\n \"acc_norm_stderr\": 0.01787994891443168\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4653179190751445,\n \"acc_stderr\": 0.026854257928258893,\n \"acc_norm\": 0.4653179190751445,\n \"acc_norm_stderr\": 0.026854257928258893\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808862,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808862\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.028614624752805434,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.028614624752805434\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4855305466237942,\n \"acc_stderr\": 0.02838619808417768,\n \"acc_norm\": 0.4855305466237942,\n \"acc_norm_stderr\": 0.02838619808417768\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.45987654320987653,\n \"acc_stderr\": 0.027731022753539274,\n \"acc_norm\": 0.45987654320987653,\n \"acc_norm_stderr\": 0.027731022753539274\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3475177304964539,\n \"acc_stderr\": 0.028406627809590947,\n \"acc_norm\": 0.3475177304964539,\n \"acc_norm_stderr\": 0.028406627809590947\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3533246414602347,\n \"acc_stderr\": 0.012208408211082428,\n \"acc_norm\": 0.3533246414602347,\n \"acc_norm_stderr\": 0.012208408211082428\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.02714627193662517,\n \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.02714627193662517\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4133986928104575,\n \"acc_stderr\": 0.01992211568278667,\n \"acc_norm\": 0.4133986928104575,\n \"acc_norm_stderr\": 0.01992211568278667\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5346938775510204,\n \"acc_stderr\": 0.03193207024425314,\n \"acc_norm\": 0.5346938775510204,\n \"acc_norm_stderr\": 0.03193207024425314\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.43373493975903615,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.43373493975903615,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5029239766081871,\n \"acc_stderr\": 0.03834759370936839,\n \"acc_norm\": 0.5029239766081871,\n \"acc_norm_stderr\": 0.03834759370936839\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n \"mc1_stderr\": 0.016077509266133022,\n \"mc2\": 0.49647374974901654,\n \"mc2_stderr\": 0.015915065186614973\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6053670086819258,\n \"acc_stderr\": 0.013736915172371888\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \"acc_stderr\": 0.003195747075480817\n }\n}\n```", "repo_url": 
"https://huggingface.co/teilomillet/MiniMerlin-3b-v0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|arc:challenge|25_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|gsm8k|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hellaswag|10_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-30-09.463717.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-30-09.463717.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-30-09.463717.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T12-30-09.463717.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-30-09.463717.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-30-09.463717.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["**/details_harness|winogrande|5_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T12-30-09.463717.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T12_30_09.463717", "path": ["results_2023-12-13T12-30-09.463717.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T12-30-09.463717.parquet"]}]}]} | 2023-12-13T12:33:52+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of teilomillet/MiniMerlin-3b-v0.1
Dataset automatically created during the evaluation run of model teilomillet/MiniMerlin-3b-v0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
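The snippet below is reproduced from this card's metadata; `harness_winogrande_5` is one of the 63 per-task configurations and can be swapped for any other:

```python
from datasets import load_dataset

# Per-sample details for a single evaluated task; the "train" split
# always points to the latest results.
data = load_dataset("open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1",
                    "harness_winogrande_5",
                    split="train")
```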
## Latest results
These are the latest results from run 2023-12-13T12:30:09.463717 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
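The aggregated metrics for this run can be retrieved from the "results" configuration; a minimal sketch, assuming the "latest" split name listed in this card's configuration metadata:

```python
from datasets import load_dataset

# Aggregated metrics of the most recent run; the "latest" split points to
# results_2023-12-13T12-30-09.463717.parquet as listed in the configs.
results = load_dataset("open-llm-leaderboard/details_teilomillet__MiniMerlin-3b-v0.1",
                       "results",
                       split="latest")
print(results[0])
```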
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of teilomillet/MiniMerlin-3b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model teilomillet/MiniMerlin-3b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T12:30:09.463717(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of teilomillet/MiniMerlin-3b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model teilomillet/MiniMerlin-3b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T12:30:09.463717(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of teilomillet/MiniMerlin-3b-v0.1\n\n\n\nDataset automatically created during the evaluation run of model teilomillet/MiniMerlin-3b-v0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T12:30:09.463717(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
fbfc2e1b8fa410544a6a248de24e48452c076431 |
# Dataset Card for Evaluation run of malhajar/meditron-7b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [malhajar/meditron-7b-chat](https://huggingface.co/malhajar/meditron-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_malhajar__meditron-7b-chat",
"harness_winogrande_5",
split="train")
```
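The aggregated metrics shown under "Latest results" below can likewise be loaded from the "results" configuration. A minimal sketch, assuming the same "results" config and "latest" split naming used by the other Open LLM Leaderboard detail datasets:

```python
from datasets import load_dataset

# Aggregated metrics for the latest run (assumes the standard "results"
# configuration and "latest" split naming of these detail datasets).
results = load_dataset("open-llm-leaderboard/details_malhajar__meditron-7b-chat",
                       "results",
                       split="latest")
```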
## Latest results
These are the [latest results from run 2023-12-13T12:44:32.691414](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__meditron-7b-chat/blob/main/results_2023-12-13T12-44-32.691414.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4088088785737693,
"acc_stderr": 0.03432891874934368,
"acc_norm": 0.412520814098851,
"acc_norm_stderr": 0.03513603001068187,
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815032,
"mc2": 0.48561313890109503,
"mc2_stderr": 0.014556131200430611
},
"harness|arc:challenge|25": {
"acc": 0.47440273037542663,
"acc_stderr": 0.014592230885298964,
"acc_norm": 0.507679180887372,
"acc_norm_stderr": 0.014609667440892574
},
"harness|hellaswag|10": {
"acc": 0.5622385978888668,
"acc_stderr": 0.004950973231188741,
"acc_norm": 0.753734315873332,
"acc_norm_stderr": 0.004299546103761425
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.35526315789473684,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.35526315789473684,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.44528301886792454,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.44528301886792454,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3236994219653179,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.3236994219653179,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3586206896551724,
"acc_stderr": 0.03996629574876719,
"acc_norm": 0.3586206896551724,
"acc_norm_stderr": 0.03996629574876719
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.023266512213730575,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.023266512213730575
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4774193548387097,
"acc_stderr": 0.028414985019707868,
"acc_norm": 0.4774193548387097,
"acc_norm_stderr": 0.028414985019707868
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.032550867699701024,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.032550867699701024
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.03903698647748441,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.03903698647748441
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.035402943770953675,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.035402943770953675
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5751295336787565,
"acc_stderr": 0.035674713352125395,
"acc_norm": 0.5751295336787565,
"acc_norm_stderr": 0.035674713352125395
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.031204691225150013,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.031204691225150013
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4972477064220184,
"acc_stderr": 0.021436998359765324,
"acc_norm": 0.4972477064220184,
"acc_norm_stderr": 0.021436998359765324
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828978,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828978
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47549019607843135,
"acc_stderr": 0.035050931943487976,
"acc_norm": 0.47549019607843135,
"acc_norm_stderr": 0.035050931943487976
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5864978902953587,
"acc_stderr": 0.03205649904851859,
"acc_norm": 0.5864978902953587,
"acc_norm_stderr": 0.03205649904851859
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.043171711948702556,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.043171711948702556
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6694214876033058,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.6694214876033058,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.39263803680981596,
"acc_stderr": 0.03836740907831028,
"acc_norm": 0.39263803680981596,
"acc_norm_stderr": 0.03836740907831028
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.4174757281553398,
"acc_stderr": 0.04882840548212238,
"acc_norm": 0.4174757281553398,
"acc_norm_stderr": 0.04882840548212238
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.032366121762202014,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.032366121762202014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5159642401021711,
"acc_stderr": 0.017870847506081738,
"acc_norm": 0.5159642401021711,
"acc_norm_stderr": 0.017870847506081738
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4190751445086705,
"acc_stderr": 0.026564178111422622,
"acc_norm": 0.4190751445086705,
"acc_norm_stderr": 0.026564178111422622
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.02818059632825929,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.02818059632825929
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.45016077170418006,
"acc_stderr": 0.02825666072336019,
"acc_norm": 0.45016077170418006,
"acc_norm_stderr": 0.02825666072336019
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4351851851851852,
"acc_stderr": 0.027586006221607697,
"acc_norm": 0.4351851851851852,
"acc_norm_stderr": 0.027586006221607697
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320193,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320193
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3259452411994785,
"acc_stderr": 0.011971507294982779,
"acc_norm": 0.3259452411994785,
"acc_norm_stderr": 0.011971507294982779
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.43014705882352944,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.43014705882352944,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.01994491413687358,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.01994491413687358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.35918367346938773,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.35918367346938773,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.4577114427860697,
"acc_stderr": 0.035228658640995975,
"acc_norm": 0.4577114427860697,
"acc_norm_stderr": 0.035228658640995975
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5906432748538012,
"acc_stderr": 0.03771283107626545,
"acc_norm": 0.5906432748538012,
"acc_norm_stderr": 0.03771283107626545
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32802937576499386,
"mc1_stderr": 0.016435632932815032,
"mc2": 0.48561313890109503,
"mc2_stderr": 0.014556131200430611
},
"harness|winogrande|5": {
"acc": 0.7316495659037096,
"acc_stderr": 0.012453340359561195
},
"harness|gsm8k|5": {
"acc": 0.09173616376042457,
"acc_stderr": 0.007950942148339342
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_malhajar__meditron-7b-chat | [
"region:us"
] | 2023-12-13T12:46:50+00:00 | {"pretty_name": "Evaluation run of malhajar/meditron-7b-chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [malhajar/meditron-7b-chat](https://huggingface.co/malhajar/meditron-7b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_malhajar__meditron-7b-chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T12:44:32.691414](https://huggingface.co/datasets/open-llm-leaderboard/details_malhajar__meditron-7b-chat/blob/main/results_2023-12-13T12-44-32.691414.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4088088785737693,\n \"acc_stderr\": 0.03432891874934368,\n \"acc_norm\": 0.412520814098851,\n \"acc_norm_stderr\": 0.03513603001068187,\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.016435632932815032,\n \"mc2\": 0.48561313890109503,\n \"mc2_stderr\": 0.014556131200430611\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.47440273037542663,\n \"acc_stderr\": 0.014592230885298964,\n \"acc_norm\": 0.507679180887372,\n \"acc_norm_stderr\": 0.014609667440892574\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5622385978888668,\n \"acc_stderr\": 0.004950973231188741,\n \"acc_norm\": 0.753734315873332,\n \"acc_norm_stderr\": 0.004299546103761425\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.35526315789473684,\n \"acc_stderr\": 0.038947344870133176,\n \"acc_norm\": 0.35526315789473684,\n \"acc_norm_stderr\": 0.038947344870133176\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.44528301886792454,\n \"acc_stderr\": 0.030588052974270655,\n \"acc_norm\": 0.44528301886792454,\n \"acc_norm_stderr\": 0.030588052974270655\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 
0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3236994219653179,\n \"acc_stderr\": 0.035676037996391706,\n \"acc_norm\": 0.3236994219653179,\n \"acc_norm_stderr\": 0.035676037996391706\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.3586206896551724,\n \"acc_stderr\": 0.03996629574876719,\n \"acc_norm\": 0.3586206896551724,\n \"acc_norm_stderr\": 0.03996629574876719\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.023266512213730575,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.023266512213730575\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4774193548387097,\n \"acc_stderr\": 0.028414985019707868,\n \"acc_norm\": 0.4774193548387097,\n \"acc_norm_stderr\": 0.028414985019707868\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.032550867699701024,\n \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.032550867699701024\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.4909090909090909,\n \"acc_stderr\": 0.03903698647748441,\n \"acc_norm\": 0.4909090909090909,\n \"acc_norm_stderr\": 0.03903698647748441\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.035402943770953675,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.035402943770953675\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.5751295336787565,\n \"acc_stderr\": 0.035674713352125395,\n \"acc_norm\": 0.5751295336787565,\n \"acc_norm_stderr\": 0.035674713352125395\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.031204691225150013,\n \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.031204691225150013\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.4972477064220184,\n \"acc_stderr\": 0.021436998359765324,\n \"acc_norm\": 0.4972477064220184,\n \"acc_norm_stderr\": 0.021436998359765324\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828978,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828978\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.47549019607843135,\n \"acc_stderr\": 0.035050931943487976,\n \"acc_norm\": 0.47549019607843135,\n \"acc_norm_stderr\": 0.035050931943487976\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.5864978902953587,\n \"acc_stderr\": 0.03205649904851859,\n \"acc_norm\": 0.5864978902953587,\n \"acc_norm_stderr\": 0.03205649904851859\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.043171711948702556,\n \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.043171711948702556\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6694214876033058,\n \"acc_stderr\": 0.04294340845212094,\n \"acc_norm\": 0.6694214876033058,\n \"acc_norm_stderr\": 0.04294340845212094\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.39263803680981596,\n \"acc_stderr\": 0.03836740907831028,\n \"acc_norm\": 0.39263803680981596,\n \"acc_norm_stderr\": 0.03836740907831028\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.4174757281553398,\n \"acc_stderr\": 0.04882840548212238,\n \"acc_norm\": 0.4174757281553398,\n \"acc_norm_stderr\": 0.04882840548212238\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5769230769230769,\n \"acc_stderr\": 0.032366121762202014,\n \"acc_norm\": 0.5769230769230769,\n \"acc_norm_stderr\": 0.032366121762202014\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-miscellaneous|5\": 
{\n \"acc\": 0.5159642401021711,\n \"acc_stderr\": 0.017870847506081738,\n \"acc_norm\": 0.5159642401021711,\n \"acc_norm_stderr\": 0.017870847506081738\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.4190751445086705,\n \"acc_stderr\": 0.026564178111422622,\n \"acc_norm\": 0.4190751445086705,\n \"acc_norm_stderr\": 0.026564178111422622\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.02818059632825929,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.02818059632825929\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.45016077170418006,\n \"acc_stderr\": 0.02825666072336019,\n \"acc_norm\": 0.45016077170418006,\n \"acc_norm_stderr\": 0.02825666072336019\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4351851851851852,\n \"acc_stderr\": 0.027586006221607697,\n \"acc_norm\": 0.4351851851851852,\n \"acc_norm_stderr\": 0.027586006221607697\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320193,\n \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320193\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3259452411994785,\n \"acc_stderr\": 0.011971507294982779,\n \"acc_norm\": 0.3259452411994785,\n \"acc_norm_stderr\": 0.011971507294982779\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.43014705882352944,\n \"acc_stderr\": 0.030074971917302875,\n \"acc_norm\": 0.43014705882352944,\n \"acc_norm_stderr\": 0.030074971917302875\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.01994491413687358,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.01994491413687358\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.030713560455108493,\n \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 0.030713560455108493\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.4577114427860697,\n \"acc_stderr\": 0.035228658640995975,\n \"acc_norm\": 0.4577114427860697,\n \"acc_norm_stderr\": 0.035228658640995975\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.5906432748538012,\n \"acc_stderr\": 0.03771283107626545,\n \"acc_norm\": 0.5906432748538012,\n \"acc_norm_stderr\": 0.03771283107626545\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32802937576499386,\n \"mc1_stderr\": 0.016435632932815032,\n \"mc2\": 0.48561313890109503,\n \"mc2_stderr\": 0.014556131200430611\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09173616376042457,\n 
\"acc_stderr\": 0.007950942148339342\n }\n}\n```", "repo_url": "https://huggingface.co/malhajar/meditron-7b-chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|arc:challenge|25_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|gsm8k|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hellaswag|10_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T12_44_32.691414", "path": ["**/details_harness|winogrande|5_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T12-44-32.691414.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T12_44_32.691414", "path": ["results_2023-12-13T12-44-32.691414.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T12-44-32.691414.parquet"]}]}]} | 2023-12-13T12:47:37+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of malhajar/meditron-7b-chat
Dataset automatically created during the evaluation run of model malhajar/meditron-7b-chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T12:44:32.691414 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of malhajar/meditron-7b-chat\n\n\n\nDataset automatically created during the evaluation run of model malhajar/meditron-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T12:44:32.691414(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of malhajar/meditron-7b-chat\n\n\n\nDataset automatically created during the evaluation run of model malhajar/meditron-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T12:44:32.691414(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of malhajar/meditron-7b-chat\n\n\n\nDataset automatically created during the evaluation run of model malhajar/meditron-7b-chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T12:44:32.691414(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
c5ba4d03e751b4f9535fd86f93612bba8c9caaf1 |
# Dataset Card for Evaluation run of martyn/llama-megamerge-dare-13b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [martyn/llama-megamerge-dare-13b](https://huggingface.co/martyn/llama-megamerge-dare-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b",
"harness_winogrande_5",
split="train")
```
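The returned object is a standard `datasets.Dataset`, so the usual inspection helpers apply. A minimal follow-up sketch (the per-example columns differ between tasks, so it checks `column_names` rather than assuming a fixed schema):

```python
from datasets import load_dataset

# Same call as above: per-example details for one task of this run.
data = load_dataset("open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b",
	"harness_winogrande_5",
	split="train")

# Inspect the schema before relying on any particular field.
print(data.column_names)

# Convert to pandas for ad-hoc analysis of the individual predictions.
df = data.to_pandas()
print(df.head())
```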
## Latest results
These are the [latest results from run 2023-12-13T12:51:07.798960](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b/blob/main/results_2023-12-13T12-51-07.798960.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5501426946833134,
"acc_stderr": 0.033816613061950815,
"acc_norm": 0.5548026687947745,
"acc_norm_stderr": 0.0345287472758958,
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4575555939890639,
"mc2_stderr": 0.015024821972393557
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464396,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.6385182234614618,
"acc_stderr": 0.004794478426382609,
"acc_norm": 0.8300139414459271,
"acc_norm_stderr": 0.0037485288878381204
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6113207547169811,
"acc_stderr": 0.030000485448675986,
"acc_norm": 0.6113207547169811,
"acc_norm_stderr": 0.030000485448675986
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342654,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342654
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6193548387096774,
"acc_stderr": 0.027621717832907032,
"acc_norm": 0.6193548387096774,
"acc_norm_stderr": 0.027621717832907032
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6787878787878788,
"acc_stderr": 0.0364620496325381,
"acc_norm": 0.6787878787878788,
"acc_norm_stderr": 0.0364620496325381
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7927461139896373,
"acc_stderr": 0.02925282329180363,
"acc_norm": 0.7927461139896373,
"acc_norm_stderr": 0.02925282329180363
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.02742001935094528,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.02742001935094528
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5966386554621849,
"acc_stderr": 0.03186608121408832,
"acc_norm": 0.5966386554621849,
"acc_norm_stderr": 0.03186608121408832
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.03362277436608044,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.03362277436608044
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7468354430379747,
"acc_stderr": 0.028304657943035296,
"acc_norm": 0.7468354430379747,
"acc_norm_stderr": 0.028304657943035296
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5954198473282443,
"acc_stderr": 0.043046937953806645,
"acc_norm": 0.5954198473282443,
"acc_norm_stderr": 0.043046937953806645
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884122,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884122
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465918,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465918
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7305236270753512,
"acc_stderr": 0.015866243073215075,
"acc_norm": 0.7305236270753512,
"acc_norm_stderr": 0.015866243073215075
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654082,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3407821229050279,
"acc_stderr": 0.01585200244986209,
"acc_norm": 0.3407821229050279,
"acc_norm_stderr": 0.01585200244986209
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.02811092849280907,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.02811092849280907
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6463022508038585,
"acc_stderr": 0.027155208103200868,
"acc_norm": 0.6463022508038585,
"acc_norm_stderr": 0.027155208103200868
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.43617021276595747,
"acc_stderr": 0.029583452036284066,
"acc_norm": 0.43617021276595747,
"acc_norm_stderr": 0.029583452036284066
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4178617992177314,
"acc_stderr": 0.01259674410899856,
"acc_norm": 0.4178617992177314,
"acc_norm_stderr": 0.01259674410899856
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.02017061497496976,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.02017061497496976
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6081632653061224,
"acc_stderr": 0.031251275910891656,
"acc_norm": 0.6081632653061224,
"acc_norm_stderr": 0.031251275910891656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355042,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3243574051407589,
"mc1_stderr": 0.01638797677964794,
"mc2": 0.4575555939890639,
"mc2_stderr": 0.015024821972393557
},
"harness|winogrande|5": {
"acc": 0.7616416732438832,
"acc_stderr": 0.011974948667702304
},
"harness|gsm8k|5": {
"acc": 0.2850644427596664,
"acc_stderr": 0.012435042334904004
}
}
```
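If only the aggregated numbers are needed, an alternative is to download the raw results file linked above rather than the per-example details. A hedged sketch using `huggingface_hub` (the filename is taken from the link above; the exact JSON layout may nest the scores under a `"results"` key, so the code checks both places):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results file from this dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b",
    filename="results_2023-12-13T12-51-07.798960.json",
    repo_type="dataset",
)

with open(path) as f:
    results = json.load(f)

# The overall averages live under "all", either at the top level or
# nested under a "results" key depending on the file layout.
all_scores = results.get("all") or results.get("results", {}).get("all", {})
print(all_scores)
```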
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b | [
"region:us"
] | 2023-12-13T12:54:06+00:00 | {"pretty_name": "Evaluation run of martyn/llama-megamerge-dare-13b", "dataset_summary": "Dataset automatically created during the evaluation run of model [martyn/llama-megamerge-dare-13b](https://huggingface.co/martyn/llama-megamerge-dare-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T12:51:07.798960](https://huggingface.co/datasets/open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b/blob/main/results_2023-12-13T12-51-07.798960.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5501426946833134,\n \"acc_stderr\": 0.033816613061950815,\n \"acc_norm\": 0.5548026687947745,\n \"acc_norm_stderr\": 0.0345287472758958,\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4575555939890639,\n \"mc2_stderr\": 0.015024821972393557\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464396,\n \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6385182234614618,\n \"acc_stderr\": 0.004794478426382609,\n \"acc_norm\": 0.8300139414459271,\n \"acc_norm_stderr\": 0.0037485288878381204\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6113207547169811,\n \"acc_stderr\": 0.030000485448675986,\n \"acc_norm\": 0.6113207547169811,\n \"acc_norm_stderr\": 0.030000485448675986\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.43,\n 
\"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.03255525359340355,\n \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.03255525359340355\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342654,\n \"acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342654\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6193548387096774,\n \"acc_stderr\": 0.027621717832907032,\n \"acc_norm\": 0.6193548387096774,\n \"acc_norm_stderr\": 0.027621717832907032\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6787878787878788,\n \"acc_stderr\": 0.0364620496325381,\n \"acc_norm\": 0.6787878787878788,\n \"acc_norm_stderr\": 0.0364620496325381\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.7927461139896373,\n \"acc_stderr\": 0.02925282329180363,\n \"acc_norm\": 0.7927461139896373,\n \"acc_norm_stderr\": 0.02925282329180363\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2814814814814815,\n \"acc_stderr\": 0.02742001935094528,\n \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.02742001935094528\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5966386554621849,\n \"acc_stderr\": 0.03186608121408832,\n \"acc_norm\": 0.5966386554621849,\n \"acc_norm_stderr\": 0.03186608121408832\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.03362277436608044,\n \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.03362277436608044\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7468354430379747,\n \"acc_stderr\": 0.028304657943035296,\n \"acc_norm\": 0.7468354430379747,\n \"acc_norm_stderr\": 0.028304657943035296\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5954198473282443,\n \"acc_stderr\": 0.043046937953806645,\n \"acc_norm\": 0.5954198473282443,\n \"acc_norm_stderr\": 0.043046937953806645\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884122,\n \"acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884122\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.036429145782924055,\n \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.036429145782924055\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465918,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465918\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7305236270753512,\n \"acc_stderr\": 0.015866243073215075,\n \"acc_norm\": 0.7305236270753512,\n \"acc_norm_stderr\": 0.015866243073215075\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654082,\n \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654082\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3407821229050279,\n \"acc_stderr\": 0.01585200244986209,\n \"acc_norm\": 0.3407821229050279,\n \"acc_norm_stderr\": 0.01585200244986209\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.02811092849280907,\n \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.02811092849280907\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6463022508038585,\n \"acc_stderr\": 0.027155208103200868,\n \"acc_norm\": 0.6463022508038585,\n \"acc_norm_stderr\": 0.027155208103200868\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.43617021276595747,\n \"acc_stderr\": 0.029583452036284066,\n \"acc_norm\": 0.43617021276595747,\n \"acc_norm_stderr\": 0.029583452036284066\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4178617992177314,\n \"acc_stderr\": 0.01259674410899856,\n \"acc_norm\": 0.4178617992177314,\n \"acc_norm_stderr\": 0.01259674410899856\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5375816993464052,\n \"acc_stderr\": 0.02017061497496976,\n \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.02017061497496976\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6081632653061224,\n \"acc_stderr\": 0.031251275910891656,\n \"acc_norm\": 0.6081632653061224,\n \"acc_norm_stderr\": 0.031251275910891656\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n \"acc_stderr\": 0.03235743789355042,\n \"acc_norm\": 0.7014925373134329,\n \"acc_norm_stderr\": 0.03235743789355042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3243574051407589,\n \"mc1_stderr\": 0.01638797677964794,\n \"mc2\": 0.4575555939890639,\n \"mc2_stderr\": 0.015024821972393557\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7616416732438832,\n \"acc_stderr\": 0.011974948667702304\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2850644427596664,\n \"acc_stderr\": 0.012435042334904004\n 
}\n}\n```", "repo_url": "https://huggingface.co/martyn/llama-megamerge-dare-13b", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|arc:challenge|25_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|gsm8k|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hellaswag|10_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T12_51_07.798960", "path": ["**/details_harness|winogrande|5_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T12-51-07.798960.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T12_51_07.798960", "path": ["results_2023-12-13T12-51-07.798960.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T12-51-07.798960.parquet"]}]}]} | 2023-12-13T12:54:48+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of martyn/llama-megamerge-dare-13b
Dataset automatically created during the evaluation run of model martyn/llama-megamerge-dare-13b on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
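For instance (a minimal sketch; the repository name below is an assumption that follows the leaderboard's usual `details_<org>__<model>` naming convention):

```python
from datasets import load_dataset

# Load one evaluation configuration of this run; "train" points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_martyn__llama-megamerge-dare-13b",
    "harness_winogrande_5",
    split="train",
)
```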
## Latest results
These are the latest results from run 2023-12-13T12:51:07.798960 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of martyn/llama-megamerge-dare-13b\n\n\n\nDataset automatically created during the evaluation run of model martyn/llama-megamerge-dare-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T12:51:07.798960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of martyn/llama-megamerge-dare-13b\n\n\n\nDataset automatically created during the evaluation run of model martyn/llama-megamerge-dare-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T12:51:07.798960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of martyn/llama-megamerge-dare-13b\n\n\n\nDataset automatically created during the evaluation run of model martyn/llama-megamerge-dare-13b on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T12:51:07.798960(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
d531047d3c1b9ee793bfb71ada7f8fe7d883172a |
# Dataset of durin (Arknights)
This is the dataset of durin (Arknights), containing 55 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains the crawlers and other tools: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 55 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 133 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 147 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 55 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 55 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 55 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 133 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 133 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 84 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 147 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 147 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
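As a minimal sketch, one of the packaged archives listed above can be fetched with `huggingface_hub` (this assumes the zip files sit at the root of the `AppleHarem/durin_arknights` dataset repository, as the relative links above suggest):

```python
from huggingface_hub import hf_hub_download

# Download the 512x704 aligned package from the dataset repository.
path = hf_hub_download(
    repo_id="AppleHarem/durin_arknights",
    filename="dataset-512x704.zip",
    repo_type="dataset",
)
print(path)  # local path to the cached zip archive
```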
| AppleHarem/durin_arknights | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-12-13T13:03:00+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-13T17:45:39+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of durin (Arknights)
============================
This is the dataset of durin (Arknights), containing 55 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the DeepGHS Team (huggingface organization).
This is a WebUI that contains the crawlers and other tools: (LittleAppleWebUI)
| [] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] | [
44
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
02e5c4682b581447e8714e04da59215648447b26 | # Dataset Card for "blemish-binary-2k"
## Dataset summary
This is a dataset of square images of 224x224 resolution of citrus fruit
in various contexts. Each image is focused on a single fruit, but contains a
buffered region around it. This dataset currently contains only labels of nominal
and blemished (properly class 4-8).
## Supported Tasks and Leaderboards
* image-classification: The goal of this task is to classify a given fruit in the image into a blemish class. A secondary goal is to classify the image as valid or not. An invalid image is disqualified from evaluation by virtue of lacking clear signal.
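As a rough sketch of that secondary goal, invalid images could be filtered out before evaluation via the `image_quality_label` feature (the label name `image_valid` and the `validation` split are assumptions based on this card's metadata):

```python
from datasets import load_dataset

# Keep only samples whose image-quality label is "image_valid".
ds = load_dataset("Aerobotics/blemish-binary-2k", split="validation")
valid_id = ds.features["image_quality_label"].str2int("image_valid")
valid_only = ds.filter(lambda ex: ex["image_quality_label"] == valid_id)
print(f"{len(ds)} samples -> {len(valid_only)} valid samples")
```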
## Languages
English
## Dataset Structure
### Data Instances
A sample from the training set is provided below:
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=224x224 at 0x12966ADF0>,
'blemish_label': 3,
'image_quality_label': 6,
'orchard_id': 486870,
'ccg_id': 2181253,
'ccg_year': 2023,
'ccg_week': 21,
'crop_id': 5,
'cultivar_id': 29,
'ml_model_version_id': 224,
'cc_id': 6249913,
'camera_type': 'tele',
's3_path_to_image': 's3://fruit-engine/v2/infield-images/cam-captures/orchards/486870/session-a9cd7f65-59bb-4323-8d3d-a05397130b50/90af3d55-ac12-423d-86ca-324c79f98c1d_tele.jpeg',
's3_path_to_geojson': 's3://fruit-engine/v2/annotations/model/224/orchards/486870/224_cca685d5-005b-4c3f-adf5-adbb3fbfa9c2_tele.geojson',
'annotation_index': 0,
'annotation_px_area': 27491.115234375,
'img_filename': 'cc_6249913_model_224_index_0.jpeg',
'annotation_filename': 'cc_6249913_model_224_index_0.json',
'annotation_filepath': 'datasets/blemishes_10k/data/annotations/cc_6249913_model_224_index_0.json'
}
```
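A minimal sketch of loading the dataset and reading such a sample (using the `Aerobotics/blemish-binary-2k` repository id given on this card); indexing the row before the column means only this one image is decoded:

```python
from datasets import load_dataset

ds = load_dataset("Aerobotics/blemish-binary-2k", split="train")

# Index the row first, then the column, so only this sample's image is decoded.
sample = ds[0]
print(sample["blemish_label"], sample["image_quality_label"])
print(sample["image"].size)  # PIL.Image.Image, e.g. (224, 224)
```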
### Data Fields
The data instances have the following fields:
* image: A PIL.Image.Image object containing the image. Note that when accessing the image column: dataset[0]["image"] the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the "image" column, i.e. dataset[0]["image"] should always be preferred over dataset["image"][0].
* blemish_label: an int classification label.
* image_quality_label: an int classification label.
* orchard_id: an int denoting the source orchard id
* ccg_id: an int identifier (2181253 in the sample above)
* ccg_year: an int for the year the image was taken
* ccg_week: an int for the week the image was taken
* crop_id: Aerobotics crop type id
* cultivar_id: Aerobotics cultivar type id
* ml_model_version_id: an int representing the fruit detection model id; this will be 'nursery' if it was human annotated
* cc_id: an int for the id of the camera image
* camera_type: string for the iPhone camera type
* s3_path_to_image: string for the path to the source image stored in S3
* s3_path_to_geojson: string for the path to the source annotation geojson stored in S3
* annotation_index: an int representing the index of the annotation in the file from the fruit detection outputs
* annotation_px_area: a float of the size of the annotation in image space
* img_filename: string image file name
* annotation_filename: string of the annotation file name
* annotation_filepath: string of the annotation file path
| Aerobotics/blemish-binary-2k | [
"region:us"
] | 2023-12-13T13:05:33+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "validation", "path": "data/validation-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "blemish_label", "dtype": {"class_label": {"names": {"0": "nominal", "1": "blemish_cat_2_3", "2": "blemish_cat_4_to_8", "3": "unlabelled"}}}}, {"name": "image_quality_label", "dtype": {"class_label": {"names": {"0": "image_valid", "1": "low_resolution", "2": "blur", "3": "occlusion_or_crop", "4": "other", "5": "sunblock", "6": "unlabelled"}}}}, {"name": "orchard_id", "dtype": "int32"}, {"name": "ccg_id", "dtype": "int32"}, {"name": "ccg_year", "dtype": "int32"}, {"name": "ccg_week", "dtype": "int32"}, {"name": "crop_id", "dtype": "int32"}, {"name": "cultivar_id", "dtype": "int32"}, {"name": "ml_model_version_id", "dtype": "int32"}, {"name": "cc_id", "dtype": "int32"}, {"name": "camera_type", "dtype": "string"}, {"name": "s3_path_to_image", "dtype": "string"}, {"name": "s3_path_to_geojson", "dtype": "string"}, {"name": "annotation_index", "dtype": "int32"}, {"name": "annotation_px_area", "dtype": "float32"}, {"name": "img_filename", "dtype": "string"}, {"name": "annotation_filename", "dtype": "string"}, {"name": "annotation_filepath", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 18658798.0, "num_examples": 834}, {"name": "validation", "num_bytes": 5753068.0, "num_examples": 256}, {"name": "test", "num_bytes": 4303557.0, "num_examples": 194}], "download_size": 0, "dataset_size": 28715423.0}} | 2023-12-19T07:32:58+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "blemish-binary-2k"
## Dataset summary
This is a dataset of square images of 224x224 resolution of citrus fruit
in various contexts. Each image is focused on a single fruit, but contains a
buffered region around it. This dataset currently contains only labels of nominal
and blemished (properly class 4-8).
## Supported Tasks and Leaderboards
* image-classification: The goal of this task is to classify a given fruit in the image into a blemish class. A secondary goal is to classify the image as valid or not. An invalid image is disqualified from evaluation by virtue of lacking clear signal.
## Languages
English
## Dataset Structure
### Data Instances
A sample from the training set is provided below:
### Data Fields
The data instances have the following fields:
* image: A PIL.Image.Image object containing the image. Note that when accessing the image column: dataset[0]["image"] the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the "image" column, i.e. dataset[0]["image"] should always be preferred over dataset["image"][0].
* blemish_label: an int classification label.
* image_quality_label: an int classification label.
* orchard_id: an int denoting the source orchard id
* ccg_id: an int identifier (2181253 in the sample above)
* ccg_year: an int for the year the image was taken
* ccg_week: an int for the week the image was taken
* crop_id: Aerobotics crop type id
* cultivar_id: Aerobotics cultivar type id
* ml_model_version_id: an int representing the fruit detection model id; this will be 'nursery' if it was human annotated
* cc_id: an int for the id of the camera image
* camera_type: string for the iPhone camera type
* s3_path_to_image: string for the path to the source image stored in S3
* s3_path_to_geojson: string for the path to the source annotation geojson stored in S3
* annotation_index: an int representing the index of the annotation in the file from the fruit detection outputs
* annotation_px_area: a float of the size of the annotation in image space
* img_filename: string image file name
* annotation_filename: string of the annotation file name
* annotation_filepath: string of the annotation file path
| [
"# Dataset Card for \"blemish-binary-2k\"",
"## Dataset summary\n\nThis is a dataset of square images of 224x224 resolution of citrus fruit\nin various contexts. Each images is focused on a single fruit, but contains a\nbuffered region around it. This dataset is currently only contains labels of nominal\nand blemished (properly class 4-8).",
"## Supported Tasks and Leaderboards\n\n * image-classification: The goal of this task is to classify a given fruit in the image into a blemish class. A secondary goal is to classify the image as valid or not. An invalid image is disqualified from evaluation by virtue of lacking clear signal.",
"## Languages\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\nA sample from the training set is provided below:",
"### Data Fields\n\nThe data instances have the following fields:\n\n * image: A PIL.Image.Image object containing the image. Note that when accessing the image column: dataset[0][\"image\"] the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the \"image\" column, i.e. dataset[0][\"image\"] should always be preferred over dataset[\"image\"][0].\n * blemish_label: an int classification label.\n * image_quality_label: an int classification label.\n * orchard_id: an int denoting the source orchard id\n * ccg_id': 2181253,\n * ccg_year: an int for the year the image was taken\n * ccg_week: an int for the week the image was taken\n * crop_id: Aerobotics crop type id\n * cultivar_id': Aerobotics cultivar type id\n * ml_model_version_id: an int representing the fruit detection model id; this will be 'nursery' if it was human annotated\n * cc_id: an int for the id of the camera image\n * camera_type: string for the iPhone camera type\n * s3_path_to_image: string for the path to the source image stored in S3\n * s3_path_to_geojson: string for the path to the source annotation geojson stored in S3\n * annotation_index: an int representing the index of the annotation in the file from the fruit detection outputs\n * annotation_px_area: a float of the size of the annotation in image space\n * img_filename: string image file name\n * annotation_filename': string of the annotation file name\n * annotation_filepath': string of the annotation file path"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"blemish-binary-2k\"",
"## Dataset summary\n\nThis is a dataset of square images of 224x224 resolution of citrus fruit\nin various contexts. Each images is focused on a single fruit, but contains a\nbuffered region around it. This dataset is currently only contains labels of nominal\nand blemished (properly class 4-8).",
"## Supported Tasks and Leaderboards\n\n * image-classification: The goal of this task is to classify a given fruit in the image into a blemish class. A secondary goal is to classify the image as valid or not. An invalid image is disqualified from evaluation by virtue of lacking clear signal.",
"## Languages\n\nEnglish",
"## Dataset Structure",
"### Data Instances\n\nA sample from the training set is provided below:",
"### Data Fields\n\nThe data instances have the following fields:\n\n * image: A PIL.Image.Image object containing the image. Note that when accessing the image column: dataset[0][\"image\"] the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the \"image\" column, i.e. dataset[0][\"image\"] should always be preferred over dataset[\"image\"][0].\n * blemish_label: an int classification label.\n * image_quality_label: an int classification label.\n * orchard_id: an int denoting the source orchard id\n * ccg_id': 2181253,\n * ccg_year: an int for the year the image was taken\n * ccg_week: an int for the week the image was taken\n * crop_id: Aerobotics crop type id\n * cultivar_id': Aerobotics cultivar type id\n * ml_model_version_id: an int representing the fruit detection model id; this will be 'nursery' if it was human annotated\n * cc_id: an int for the id of the camera image\n * camera_type: string for the iPhone camera type\n * s3_path_to_image: string for the path to the source image stored in S3\n * s3_path_to_geojson: string for the path to the source annotation geojson stored in S3\n * annotation_index: an int representing the index of the annotation in the file from the fruit detection outputs\n * annotation_px_area: a float of the size of the annotation in image space\n * img_filename: string image file name\n * annotation_filename': string of the annotation file name\n * annotation_filepath': string of the annotation file path"
] | [
6,
14,
71,
69,
4,
6,
16,
451
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"blemish-binary-2k\"## Dataset summary\n\nThis is a dataset of square images of 224x224 resolution of citrus fruit\nin various contexts. Each images is focused on a single fruit, but contains a\nbuffered region around it. This dataset is currently only contains labels of nominal\nand blemished (properly class 4-8).## Supported Tasks and Leaderboards\n\n * image-classification: The goal of this task is to classify a given fruit in the image into a blemish class. A secondary goal is to classify the image as valid or not. An invalid image is disqualified from evaluation by virtue of lacking clear signal.## Languages\n\nEnglish## Dataset Structure### Data Instances\n\nA sample from the training set is provided below:"
] |
e85d06919841ea4cf1e1b9ee9253d6e6e5bb492a | # Dataset Card for "vi_wiki_sentences_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | deokhk/vi_wiki_sentences_1000000 | [
"region:us"
] | 2023-12-13T13:18:55+00:00 | {"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 140746222, "num_examples": 1000000}, {"name": "dev", "num_bytes": 180584, "num_examples": 1000}], "download_size": 66188079, "dataset_size": 140926806}} | 2023-12-13T13:19:09+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vi_wiki_sentences_1000000"
More Information needed | [
"# Dataset Card for \"vi_wiki_sentences_1000000\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vi_wiki_sentences_1000000\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"vi_wiki_sentences_1000000\"\n\nMore Information needed"
] |
72864e33accbb60176e8299e7b90806b8824e516 | # Dataset Card for "vi_wiki_sentences_100000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | deokhk/vi_wiki_sentences_100000 | [
"region:us"
] | 2023-12-13T13:19:13+00:00 | {"dataset_info": {"features": [{"name": "sentence", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 13451997, "num_examples": 100000}, {"name": "dev", "num_bytes": 174026, "num_examples": 1000}], "download_size": 6214856, "dataset_size": 13626023}} | 2023-12-13T13:19:20+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "vi_wiki_sentences_100000"
More Information needed | [
"# Dataset Card for \"vi_wiki_sentences_100000\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"vi_wiki_sentences_100000\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"vi_wiki_sentences_100000\"\n\nMore Information needed"
] |
1de3e0cd61063f7b0b2059e4eaa7cd390e40f7cc |
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_code_instruct_0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_code_instruct_0.1](https://huggingface.co/mwitiderrick/open_llama_3b_code_instruct_0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1",
"harness_winogrande_5",
split="train")
```
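Likewise, as a sketch, the aggregated metrics can be read from the "results" configuration described above (the `latest` split name is taken from this repository's configuration metadata):

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run.
results = load_dataset(
    "open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1",
    "results",
    split="latest",
)
print(results[0])
```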
## Latest results
These are the [latest results from run 2023-12-13T13:18:57.065820](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1/blob/main/results_2023-12-13T13-18-57.065820.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.28520259561896705,
"acc_stderr": 0.03175383797712848,
"acc_norm": 0.28706024238206684,
"acc_norm_stderr": 0.03253897164366445,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862673,
"mc2": 0.35013884322339384,
"mc2_stderr": 0.013592802427042715
},
"harness|arc:challenge|25": {
"acc": 0.37542662116040953,
"acc_stderr": 0.014150631435111728,
"acc_norm": 0.4121160409556314,
"acc_norm_stderr": 0.014383915302225398
},
"harness|hellaswag|10": {
"acc": 0.4938259310894244,
"acc_stderr": 0.0049894009847222245,
"acc_norm": 0.6695877315275841,
"acc_norm_stderr": 0.004694002781939556
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.040491220417025055,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.040491220417025055
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.29605263157894735,
"acc_stderr": 0.037150621549989056,
"acc_norm": 0.29605263157894735,
"acc_norm_stderr": 0.037150621549989056
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.32947976878612717,
"acc_stderr": 0.03583901754736411,
"acc_norm": 0.32947976878612717,
"acc_norm_stderr": 0.03583901754736411
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292323,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292323
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893624,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893624
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.02271746789770862,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.02271746789770862
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790606,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.02468597928623995,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.02468597928623995
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626301,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626301
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.30569948186528495,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.30569948186528495,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.02626502460827589,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.02626502460827589
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27155963302752295,
"acc_stderr": 0.019069098363191456,
"acc_norm": 0.27155963302752295,
"acc_norm_stderr": 0.019069098363191456
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375798,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375798
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145638,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145638
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.03006958487449405,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.03006958487449405
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.1984732824427481,
"acc_stderr": 0.03498149385462471,
"acc_norm": 0.1984732824427481,
"acc_norm_stderr": 0.03498149385462471
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3006134969325153,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.3006134969325153,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.040073418097558045,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.040073418097558045
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770978,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770978
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.02440517393578324,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.02440517393578324
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958157,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958157
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3086816720257235,
"acc_stderr": 0.02623696588115326,
"acc_norm": 0.3086816720257235,
"acc_norm_stderr": 0.02623696588115326
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23049645390070922,
"acc_stderr": 0.025123739226872405,
"acc_norm": 0.23049645390070922,
"acc_norm_stderr": 0.025123739226872405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26597131681877445,
"acc_stderr": 0.011285033165551284,
"acc_norm": 0.26597131681877445,
"acc_norm_stderr": 0.011285033165551284
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596455,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596455
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.017630827375148383,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.017630827375148383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.04309118709946458,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.04309118709946458
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960244,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960244
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24875621890547264,
"acc_stderr": 0.030567675938916707,
"acc_norm": 0.24875621890547264,
"acc_norm_stderr": 0.030567675938916707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.28654970760233917,
"acc_stderr": 0.03467826685703826,
"acc_norm": 0.28654970760233917,
"acc_norm_stderr": 0.03467826685703826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862673,
"mc2": 0.35013884322339384,
"mc2_stderr": 0.013592802427042715
},
"harness|winogrande|5": {
"acc": 0.654301499605367,
"acc_stderr": 0.013366596951934375
},
"harness|gsm8k|5": {
"acc": 0.018953752843062926,
"acc_stderr": 0.003756078341031478
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1 | [
"region:us"
] | 2023-12-13T13:21:08+00:00 | {"pretty_name": "Evaluation run of mwitiderrick/open_llama_3b_code_instruct_0.1", "dataset_summary": "Dataset automatically created during the evaluation run of model [mwitiderrick/open_llama_3b_code_instruct_0.1](https://huggingface.co/mwitiderrick/open_llama_3b_code_instruct_0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T13:18:57.065820](https://huggingface.co/datasets/open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1/blob/main/results_2023-12-13T13-18-57.065820.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.28520259561896705,\n \"acc_stderr\": 0.03175383797712848,\n \"acc_norm\": 0.28706024238206684,\n \"acc_norm_stderr\": 0.03253897164366445,\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862673,\n \"mc2\": 0.35013884322339384,\n \"mc2_stderr\": 0.013592802427042715\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.37542662116040953,\n \"acc_stderr\": 0.014150631435111728,\n \"acc_norm\": 0.4121160409556314,\n \"acc_norm_stderr\": 0.014383915302225398\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4938259310894244,\n \"acc_stderr\": 0.0049894009847222245,\n \"acc_norm\": 0.6695877315275841,\n \"acc_norm_stderr\": 0.004694002781939556\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.040491220417025055,\n \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.040491220417025055\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.29605263157894735,\n \"acc_stderr\": 0.037150621549989056,\n \"acc_norm\": 0.29605263157894735,\n \"acc_norm_stderr\": 0.037150621549989056\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.32947976878612717,\n \"acc_stderr\": 0.03583901754736411,\n \"acc_norm\": 0.32947976878612717,\n \"acc_norm_stderr\": 0.03583901754736411\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292323,\n \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292323\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.041424397194893624,\n \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.041424397194893624\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.26455026455026454,\n \"acc_stderr\": 0.02271746789770862,\n \"acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.02271746789770862\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.03764950879790606,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.03764950879790606\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n \"acc_stderr\": 0.02468597928623995,\n \"acc_norm\": 0.25161290322580643,\n \"acc_norm_stderr\": 0.02468597928623995\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626301,\n \"acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626301\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.30569948186528495,\n \"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 0.30569948186528495,\n 
\"acc_norm_stderr\": 0.033248379397581594\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.023901157979402538,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.023901157979402538\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.02626502460827589,\n \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.02626502460827589\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.27155963302752295,\n \"acc_stderr\": 0.019069098363191456,\n \"acc_norm\": 0.27155963302752295,\n \"acc_norm_stderr\": 0.019069098363191456\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375798,\n \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375798\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145638,\n \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145638\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598025,\n \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598025\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n \"acc_stderr\": 0.03006958487449405,\n \"acc_norm\": 0.27802690582959644,\n \"acc_norm_stderr\": 0.03006958487449405\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.1984732824427481,\n \"acc_stderr\": 0.03498149385462471,\n \"acc_norm\": 0.1984732824427481,\n \"acc_norm_stderr\": 0.03498149385462471\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292534,\n \"acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292534\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.3006134969325153,\n \"acc_stderr\": 0.03602511318806771,\n \"acc_norm\": 0.3006134969325153,\n \"acc_norm_stderr\": 0.03602511318806771\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n \"acc_stderr\": 0.040073418097558045,\n \"acc_norm\": 0.23214285714285715,\n \"acc_norm_stderr\": 0.040073418097558045\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n 
},\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n \"acc_stderr\": 0.016225017944770978,\n \"acc_norm\": 0.28991060025542786,\n \"acc_norm_stderr\": 0.016225017944770978\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.28901734104046245,\n \"acc_stderr\": 0.02440517393578324,\n \"acc_norm\": 0.28901734104046245,\n \"acc_norm_stderr\": 0.02440517393578324\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958157,\n \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958157\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3086816720257235,\n \"acc_stderr\": 0.02623696588115326,\n \"acc_norm\": 0.3086816720257235,\n \"acc_norm_stderr\": 0.02623696588115326\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.025329888171900926,\n \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.025329888171900926\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23049645390070922,\n \"acc_stderr\": 0.025123739226872405,\n \"acc_norm\": 0.23049645390070922,\n \"acc_norm_stderr\": 0.025123739226872405\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26597131681877445,\n \"acc_stderr\": 0.011285033165551284,\n \"acc_norm\": 0.26597131681877445,\n \"acc_norm_stderr\": 0.011285033165551284\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596455,\n \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596455\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.017630827375148383,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.017630827375148383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n \"acc_stderr\": 0.04309118709946458,\n \"acc_norm\": 0.2818181818181818,\n \"acc_norm_stderr\": 0.04309118709946458\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960244,\n \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960244\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24875621890547264,\n \"acc_stderr\": 0.030567675938916707,\n \"acc_norm\": 0.24875621890547264,\n \"acc_norm_stderr\": 0.030567675938916707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.28654970760233917,\n \"acc_stderr\": 0.03467826685703826,\n \"acc_norm\": 0.28654970760233917,\n \"acc_norm_stderr\": 0.03467826685703826\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n \"mc1_stderr\": 0.014761945174862673,\n \"mc2\": 0.35013884322339384,\n \"mc2_stderr\": 0.013592802427042715\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.654301499605367,\n \"acc_stderr\": 0.013366596951934375\n },\n 
\"harness|gsm8k|5\": {\n \"acc\": 0.018953752843062926,\n \"acc_stderr\": 0.003756078341031478\n }\n}\n```", "repo_url": "https://huggingface.co/mwitiderrick/open_llama_3b_code_instruct_0.1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-18-57.065820.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-18-57.065820.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-18-57.065820.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-18-57.065820.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-18-57.065820.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["**/details_harness|winogrande|5_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-13T13-18-57.065820.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T13_18_57.065820", "path": ["results_2023-12-13T13-18-57.065820.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T13-18-57.065820.parquet"]}]}]} | 2023-12-13T13:21:51+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_code_instruct_0.1
Dataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_code_instruct_0.1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
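```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1",
	"harness_winogrande_5",
	split="train")
```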
## Latest results
These are the latest results from run 2023-12-13T13:18:57.065820 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
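The aggregated metrics for this run can also be retrieved programmatically. A minimal sketch, assuming the "results" configuration and "latest" split listed in this card's metadata:

```python
from datasets import load_dataset

# Aggregated metrics from the most recent evaluation run of this model
results = load_dataset(
    "open-llm-leaderboard/details_mwitiderrick__open_llama_3b_code_instruct_0.1",
    "results",
    split="latest",
)
```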
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_code_instruct_0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_code_instruct_0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:18:57.065820(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_code_instruct_0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_code_instruct_0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:18:57.065820(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
199,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of mwitiderrick/open_llama_3b_code_instruct_0.1\n\n\n\nDataset automatically created during the evaluation run of model mwitiderrick/open_llama_3b_code_instruct_0.1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T13:18:57.065820(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
f971f16631fef43994873aa1d7677d615f64c8f9 |
# Dataset Card for Evaluation run of vihangd/dopeyshearedplats-1.3b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [vihangd/dopeyshearedplats-1.3b-v1](https://huggingface.co/vihangd/dopeyshearedplats-1.3b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-13T13:37:34.130815](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1/blob/main/results_2023-12-13T13-37-34.130815.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26012302704770085,
"acc_stderr": 0.030820336255728206,
"acc_norm": 0.2621303940455793,
"acc_norm_stderr": 0.031589269063273896,
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.3821066604136214,
"mc2_stderr": 0.015269097668070952
},
"harness|arc:challenge|25": {
"acc": 0.3225255972696246,
"acc_stderr": 0.013659980894277368,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156215
},
"harness|hellaswag|10": {
"acc": 0.4848635729934276,
"acc_stderr": 0.004987494455523719,
"acc_norm": 0.6430989842660825,
"acc_norm_stderr": 0.004781061390873926
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2,
"acc_stderr": 0.034554737023254394,
"acc_norm": 0.2,
"acc_norm_stderr": 0.034554737023254394
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.25660377358490566,
"acc_stderr": 0.026880647889051958,
"acc_norm": 0.25660377358490566,
"acc_norm_stderr": 0.026880647889051958
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624576,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624576
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483099,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483099
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.03097669299853443,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.03097669299853443
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.21379310344827587,
"acc_stderr": 0.03416520447747549,
"acc_norm": 0.21379310344827587,
"acc_norm_stderr": 0.03416520447747549
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114475,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114475
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.28484848484848485,
"acc_stderr": 0.03524390844511784,
"acc_norm": 0.28484848484848485,
"acc_norm_stderr": 0.03524390844511784
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.031911782267135466,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.031911782267135466
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.25906735751295334,
"acc_stderr": 0.03161877917935409,
"acc_norm": 0.25906735751295334,
"acc_norm_stderr": 0.03161877917935409
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.023507579020645333,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.023507579020645333
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844082,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279483,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279483
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.25871559633027524,
"acc_stderr": 0.01877605231961962,
"acc_norm": 0.25871559633027524,
"acc_norm_stderr": 0.01877605231961962
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39351851851851855,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.39351851851851855,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.03198001660115071,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.03198001660115071
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.21518987341772153,
"acc_stderr": 0.026750826994676166,
"acc_norm": 0.21518987341772153,
"acc_norm_stderr": 0.026750826994676166
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847834,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847834
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004257,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004257
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653696,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653696
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2669220945083014,
"acc_stderr": 0.015818450894777573,
"acc_norm": 0.2669220945083014,
"acc_norm_stderr": 0.015818450894777573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098407,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098407
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20261437908496732,
"acc_stderr": 0.023015446877985672,
"acc_norm": 0.20261437908496732,
"acc_norm_stderr": 0.023015446877985672
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410612,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967287,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.28368794326241137,
"acc_stderr": 0.02689170942834396,
"acc_norm": 0.28368794326241137,
"acc_norm_stderr": 0.02689170942834396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24967405475880053,
"acc_stderr": 0.011054538377832327,
"acc_norm": 0.24967405475880053,
"acc_norm_stderr": 0.011054538377832327
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.16176470588235295,
"acc_stderr": 0.022368672562886754,
"acc_norm": 0.16176470588235295,
"acc_norm_stderr": 0.022368672562886754
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322284,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322284
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2,
"acc_stderr": 0.03831305140884601,
"acc_norm": 0.2,
"acc_norm_stderr": 0.03831305140884601
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2693877551020408,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.2693877551020408,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21637426900584794,
"acc_stderr": 0.03158149539338735,
"acc_norm": 0.21637426900584794,
"acc_norm_stderr": 0.03158149539338735
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2460220318237454,
"mc1_stderr": 0.01507721920066259,
"mc2": 0.3821066604136214,
"mc2_stderr": 0.015269097668070952
},
"harness|winogrande|5": {
"acc": 0.5737963693764798,
"acc_stderr": 0.013898585965412338
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.002389281512077212
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1 | [
"region:us"
] | 2023-12-13T13:40:30+00:00 | {"pretty_name": "Evaluation run of vihangd/dopeyshearedplats-1.3b-v1", "dataset_summary": "Dataset automatically created during the evaluation run of model [vihangd/dopeyshearedplats-1.3b-v1](https://huggingface.co/vihangd/dopeyshearedplats-1.3b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T13:37:34.130815](https://huggingface.co/datasets/open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1/blob/main/results_2023-12-13T13-37-34.130815.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26012302704770085,\n \"acc_stderr\": 0.030820336255728206,\n \"acc_norm\": 0.2621303940455793,\n \"acc_norm_stderr\": 0.031589269063273896,\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.3821066604136214,\n \"mc2_stderr\": 0.015269097668070952\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.3225255972696246,\n \"acc_stderr\": 0.013659980894277368,\n \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156215\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4848635729934276,\n \"acc_stderr\": 0.004987494455523719,\n \"acc_norm\": 0.6430989842660825,\n \"acc_norm_stderr\": 0.004781061390873926\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.034554737023254394,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.034554737023254394\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.25660377358490566,\n \"acc_stderr\": 0.026880647889051958,\n \"acc_norm\": 0.25660377358490566,\n \"acc_norm_stderr\": 0.026880647889051958\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.037455547914624576,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.037455547914624576\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 
0.042295258468165044,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n \"acc_stderr\": 0.03242414757483099,\n \"acc_norm\": 0.23699421965317918,\n \"acc_norm_stderr\": 0.03242414757483099\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816508,\n \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816508\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3404255319148936,\n \"acc_stderr\": 0.03097669299853443,\n \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.03097669299853443\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.21379310344827587,\n \"acc_stderr\": 0.03416520447747549,\n \"acc_norm\": 0.21379310344827587,\n \"acc_norm_stderr\": 0.03416520447747549\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.27419354838709675,\n \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114475,\n \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114475\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.28484848484848485,\n \"acc_stderr\": 0.03524390844511784,\n \"acc_norm\": 0.28484848484848485,\n \"acc_norm_stderr\": 0.03524390844511784\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.031911782267135466,\n \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.031911782267135466\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.25906735751295334,\n \"acc_stderr\": 0.03161877917935409,\n \"acc_norm\": 0.25906735751295334,\n \"acc_norm_stderr\": 0.03161877917935409\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.023507579020645333,\n \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.023507579020645333\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279483,\n \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279483\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.25871559633027524,\n \"acc_stderr\": 0.01877605231961962,\n \"acc_norm\": 0.25871559633027524,\n \"acc_norm_stderr\": 0.01877605231961962\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.39351851851851855,\n \"acc_stderr\": 0.03331747876370312,\n \"acc_norm\": 0.39351851851851855,\n \"acc_norm_stderr\": 0.03331747876370312\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.03198001660115071,\n \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.03198001660115071\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.21518987341772153,\n \"acc_stderr\": 0.026750826994676166,\n \"acc_norm\": 0.21518987341772153,\n \"acc_norm_stderr\": 0.026750826994676166\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.34080717488789236,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847834,\n \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847834\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.03259177392742178,\n \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.03259177392742178\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.028605953702004257,\n \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.028605953702004257\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.03861229196653696,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.03861229196653696\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n 
\"acc\": 0.2669220945083014,\n \"acc_stderr\": 0.015818450894777573,\n \"acc_norm\": 0.2669220945083014,\n \"acc_norm_stderr\": 0.015818450894777573\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.022183477668412856,\n \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.022183477668412856\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n \"acc_stderr\": 0.014378169884098407,\n \"acc_norm\": 0.2446927374301676,\n \"acc_norm_stderr\": 0.014378169884098407\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.20261437908496732,\n \"acc_stderr\": 0.023015446877985672,\n \"acc_norm\": 0.20261437908496732,\n \"acc_norm_stderr\": 0.023015446877985672\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n \"acc_stderr\": 0.025218040373410612,\n \"acc_norm\": 0.27009646302250806,\n \"acc_norm_stderr\": 0.025218040373410612\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967287,\n \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967287\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.28368794326241137,\n \"acc_stderr\": 0.02689170942834396,\n \"acc_norm\": 0.28368794326241137,\n \"acc_norm_stderr\": 0.02689170942834396\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24967405475880053,\n \"acc_stderr\": 0.011054538377832327,\n \"acc_norm\": 0.24967405475880053,\n \"acc_norm_stderr\": 0.011054538377832327\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.16176470588235295,\n \"acc_stderr\": 0.022368672562886754,\n \"acc_norm\": 0.16176470588235295,\n \"acc_norm_stderr\": 0.022368672562886754\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322284,\n \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322284\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.03831305140884601,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.03831305140884601\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.2693877551020408,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.2693877551020408,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.23880597014925373,\n \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.21637426900584794,\n \"acc_stderr\": 0.03158149539338735,\n \"acc_norm\": 0.21637426900584794,\n \"acc_norm_stderr\": 0.03158149539338735\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2460220318237454,\n \"mc1_stderr\": 0.01507721920066259,\n \"mc2\": 0.3821066604136214,\n \"mc2_stderr\": 0.015269097668070952\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5737963693764798,\n \"acc_stderr\": 0.013898585965412338\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \"acc_stderr\": 
0.002389281512077212\n }\n}\n```", "repo_url": "https://huggingface.co/vihangd/dopeyshearedplats-1.3b-v1", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T13_37_34.130815", "path": ["**/details_harness|winogrande|5_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T13-37-34.130815.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T13_37_34.130815", "path": ["results_2023-12-13T13-37-34.130815.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T13-37-34.130815.parquet"]}]}]} | 2023-12-13T13:41:12+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of vihangd/dopeyshearedplats-1.3b-v1
Dataset automatically created during the evaluation run of model vihangd/dopeyshearedplats-1.3b-v1 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
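As a minimal sketch (assuming this card follows the usual leaderboard naming pattern, so the details repo would be `open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1`):

```python
from datasets import load_dataset

# Assumed repo name, following the details_<org>__<model> pattern used by the leaderboard
data = load_dataset("open-llm-leaderboard/details_vihangd__dopeyshearedplats-1.3b-v1",
	"harness_winogrande_5",
	split="train")
```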
## Latest results
These are the latest results from run 2023-12-13T13:37:34.130815 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of vihangd/dopeyshearedplats-1.3b-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/dopeyshearedplats-1.3b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:37:34.130815(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of vihangd/dopeyshearedplats-1.3b-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/dopeyshearedplats-1.3b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:37:34.130815(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of vihangd/dopeyshearedplats-1.3b-v1\n\n\n\nDataset automatically created during the evaluation run of model vihangd/dopeyshearedplats-1.3b-v1 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T13:37:34.130815(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
0363d16d49cb3af68a0a512404d81501515dadc4 |
# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [v1olet/v1olet_merged_dpo_7B_v4](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4",
"harness_winogrande_5",
split="train")
```
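
The same call can point at other configurations; for instance, a sketch that loads the aggregated "results" configuration (assuming it exposes a "latest" split, as the per-task configurations listed in this repo's metadata do):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of each run; "latest" points to the most recent one
results = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4",
	"results",
	split="latest")
```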
## Latest results
These are the [latest results from run 2023-12-13T13:46:12.224585](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4/blob/main/results_2023-12-13T13-46-12.224585.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5917202264245718,
"acc_stderr": 0.03324717259397107,
"acc_norm": 0.5957734427293545,
"acc_norm_stderr": 0.0339416190415928,
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.5943157054555347,
"mc2_stderr": 0.01604355026591654
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.6697952218430034,
"acc_norm_stderr": 0.013743085603760426
},
"harness|hellaswag|10": {
"acc": 0.6450906193985262,
"acc_stderr": 0.004775079636567097,
"acc_norm": 0.8408683529177454,
"acc_norm_stderr": 0.003650512158306275
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.040166600304512336,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.040166600304512336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201942,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201942
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7354838709677419,
"acc_stderr": 0.02509189237885928,
"acc_norm": 0.7354838709677419,
"acc_norm_stderr": 0.02509189237885928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.844559585492228,
"acc_stderr": 0.026148483469153303,
"acc_norm": 0.844559585492228,
"acc_norm_stderr": 0.026148483469153303
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6092436974789915,
"acc_stderr": 0.03169380235712996,
"acc_norm": 0.6092436974789915,
"acc_norm_stderr": 0.03169380235712996
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7688073394495413,
"acc_stderr": 0.018075750241633146,
"acc_norm": 0.7688073394495413,
"acc_norm_stderr": 0.018075750241633146
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.03019028245350195,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.03019028245350195
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.03714908409935573,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.03714908409935573
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937148,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937148
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.45251396648044695,
"acc_stderr": 0.016646914804438775,
"acc_norm": 0.45251396648044695,
"acc_norm_stderr": 0.016646914804438775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.027582811415159614,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.027582811415159614
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001862,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001862
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543448,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543448
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788236,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788236
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.0198351764843754,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.0198351764843754
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6285714285714286,
"acc_stderr": 0.030932858792789855,
"acc_norm": 0.6285714285714286,
"acc_norm_stderr": 0.030932858792789855
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169146,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169146
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4614443084455324,
"mc1_stderr": 0.017451384104637455,
"mc2": 0.5943157054555347,
"mc2_stderr": 0.01604355026591654
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989245
},
"harness|gsm8k|5": {
"acc": 0.3525398028809704,
"acc_stderr": 0.013159909755930323
}
}
```
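
Since the per-task entries share the same key layout, they are easy to summarize programmatically. As a sketch (not part of the original card), a hypothetical helper that averages `acc_norm` over the MMLU (hendrycksTest) subtasks of a results dict shaped like the JSON above:

```python
# Hypothetical helper: average acc_norm over the MMLU (hendrycksTest) subtasks
# of a results dict shaped like the JSON above.
def mmlu_average(results: dict) -> float:
    scores = [
        task["acc_norm"]
        for name, task in results.items()
        if name.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)
```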
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4 | [
"region:us"
] | 2023-12-13T13:49:02+00:00 | {"pretty_name": "Evaluation run of v1olet/v1olet_merged_dpo_7B_v4", "dataset_summary": "Dataset automatically created during the evaluation run of model [v1olet/v1olet_merged_dpo_7B_v4](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T13:46:12.224585](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v4/blob/main/results_2023-12-13T13-46-12.224585.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5917202264245718,\n \"acc_stderr\": 0.03324717259397107,\n \"acc_norm\": 0.5957734427293545,\n \"acc_norm_stderr\": 0.0339416190415928,\n \"mc1\": 0.4614443084455324,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.5943157054555347,\n \"mc2_stderr\": 0.01604355026591654\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n \"acc_norm\": 0.6697952218430034,\n \"acc_norm_stderr\": 0.013743085603760426\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6450906193985262,\n \"acc_stderr\": 0.004775079636567097,\n \"acc_norm\": 0.8408683529177454,\n \"acc_norm_stderr\": 0.003650512158306275\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.040166600304512336,\n \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.040166600304512336\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 
0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201942,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201942\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.03265019475033582,\n \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.03265019475033582\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7354838709677419,\n \"acc_stderr\": 0.02509189237885928,\n \"acc_norm\": 0.7354838709677419,\n \"acc_norm_stderr\": 0.02509189237885928\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7626262626262627,\n \"acc_stderr\": 0.0303137105381989,\n \"acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.0303137105381989\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.844559585492228,\n \"acc_stderr\": 0.026148483469153303,\n \"acc_norm\": 0.844559585492228,\n \"acc_norm_stderr\": 0.026148483469153303\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6092436974789915,\n \"acc_stderr\": 0.03169380235712996,\n \"acc_norm\": 0.6092436974789915,\n \"acc_norm_stderr\": 0.03169380235712996\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7688073394495413,\n \"acc_stderr\": 0.018075750241633146,\n \"acc_norm\": 0.7688073394495413,\n \"acc_norm_stderr\": 0.018075750241633146\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.03019028245350195,\n \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.03019028245350195\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.03714908409935573,\n \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.03714908409935573\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.7752234993614304,\n \"acc_stderr\": 0.014927447101937148,\n \"acc_norm\": 0.7752234993614304,\n \"acc_norm_stderr\": 0.014927447101937148\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.45251396648044695,\n \"acc_stderr\": 0.016646914804438775,\n \"acc_norm\": 0.45251396648044695,\n \"acc_norm_stderr\": 0.016646914804438775\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6339869281045751,\n \"acc_stderr\": 0.027582811415159614,\n \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.027582811415159614\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n \"acc_stderr\": 0.026462487777001862,\n \"acc_norm\": 0.654320987654321,\n \"acc_norm_stderr\": 0.026462487777001862\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543448,\n \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543448\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n \"acc_stderr\": 0.012602244505788236,\n \"acc_norm\": 0.41916558018252936,\n \"acc_norm_stderr\": 0.012602244505788236\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.0198351764843754,\n \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.0198351764843754\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6285714285714286,\n \"acc_stderr\": 0.030932858792789855,\n \"acc_norm\": 0.6285714285714286,\n \"acc_norm_stderr\": 0.030932858792789855\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.025870646766169146,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.025870646766169146\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4614443084455324,\n \"mc1_stderr\": 0.017451384104637455,\n \"mc2\": 0.5943157054555347,\n \"mc2_stderr\": 0.01604355026591654\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989245\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3525398028809704,\n \"acc_stderr\": 
0.013159909755930323\n }\n}\n```", "repo_url": "https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v4", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T13_46_12.224585", "path": ["**/details_harness|winogrande|5_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T13-46-12.224585.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T13_46_12.224585", "path": ["results_2023-12-13T13-46-12.224585.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T13-46-12.224585.parquet"]}]}]} | 2023-12-13T13:49:44+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v4
Dataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v4 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T13:46:12.224585 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v4\n\n\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:46:12.224585(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v4\n\n\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:46:12.224585(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v4\n\n\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v4 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T13:46:12.224585(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
6200a5fb7c53a88d8f07243a551d16764595ae67 |
# Dataset Card for Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Weyaxi/Instruct-v0.2-Seraph-7B](https://huggingface.co/Weyaxi/Instruct-v0.2-Seraph-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B",
"harness_winogrande_5",
split="train")
```
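
You can also load just the aggregated scores instead of a single per-task configuration. A minimal sketch, using the same `datasets` API as above (the exact column layout of the aggregated parquet may vary between harness versions):

```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for each evaluation run;
# the "latest" split always points at the most recent timestamp.
results = load_dataset("open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B",
	"results",
	split="latest")
print(results[0])
```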
## Latest results
These are the [latest results from run 2023-12-13T13:51:56.485977](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B/blob/main/results_2023-12-13T13-51-56.485977.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.630546295773675,
"acc_stderr": 0.03269222189548608,
"acc_norm": 0.6329595358982607,
"acc_norm_stderr": 0.03335128778096069,
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.6539318892450207,
"mc2_stderr": 0.015152914709562705
},
"harness|arc:challenge|25": {
"acc": 0.6117747440273038,
"acc_stderr": 0.014241614207414046,
"acc_norm": 0.6476109215017065,
"acc_norm_stderr": 0.013960142600598672
},
"harness|hellaswag|10": {
"acc": 0.6605257916749652,
"acc_stderr": 0.004725630911520329,
"acc_norm": 0.8419637522405895,
"acc_norm_stderr": 0.003640294912838693
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944433,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229865,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229865
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6282051282051282,
"acc_stderr": 0.024503472557110936,
"acc_norm": 0.6282051282051282,
"acc_norm_stderr": 0.024503472557110936
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849316,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849316
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381401,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381401
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4849162011173184,
"acc_stderr": 0.016714890379996062,
"acc_norm": 0.4849162011173184,
"acc_norm_stderr": 0.016714890379996062
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.019412539242032165,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.019412539242032165
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.02796267760476891,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.02796267760476891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4810281517747858,
"mc1_stderr": 0.01749089640576235,
"mc2": 0.6539318892450207,
"mc2_stderr": 0.015152914709562705
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987736
},
"harness|gsm8k|5": {
"acc": 0.5443517816527672,
"acc_stderr": 0.013718194542485601
}
}
```
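
If you want to compare benchmarks side by side, the JSON block above can be flattened into a small table. A minimal sketch, assuming the block has been saved locally as `results.json` (a hypothetical filename) and that `pandas` is available:

```python
import json

import pandas as pd

# Turn the per-task metrics shown above into one row per benchmark.
with open("results.json") as f:  # hypothetical local copy of the JSON block above
    metrics = json.load(f)

rows = [{"task": task, **scores} for task, scores in metrics.items() if task != "all"]
df = pd.DataFrame(rows).set_index("task")
# Tasks that only report mc1/mc2 (e.g. truthfulqa) will show NaN for "acc".
print(df[["acc", "acc_stderr"]].sort_values("acc", ascending=False).head(10))
```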
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B | [
"region:us"
] | 2023-12-13T13:54:46+00:00 | {"pretty_name": "Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Weyaxi/Instruct-v0.2-Seraph-7B](https://huggingface.co/Weyaxi/Instruct-v0.2-Seraph-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T13:51:56.485977](https://huggingface.co/datasets/open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B/blob/main/results_2023-12-13T13-51-56.485977.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.630546295773675,\n \"acc_stderr\": 0.03269222189548608,\n \"acc_norm\": 0.6329595358982607,\n \"acc_norm_stderr\": 0.03335128778096069,\n \"mc1\": 0.4810281517747858,\n \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.6539318892450207,\n \"mc2_stderr\": 0.015152914709562705\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6117747440273038,\n \"acc_stderr\": 0.014241614207414046,\n \"acc_norm\": 0.6476109215017065,\n \"acc_norm_stderr\": 0.013960142600598672\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6605257916749652,\n \"acc_stderr\": 0.004725630911520329,\n \"acc_norm\": 0.8419637522405895,\n \"acc_norm_stderr\": 0.003640294912838693\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n 
\"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944433,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229865,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229865\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6282051282051282,\n \"acc_stderr\": 0.024503472557110936,\n \"acc_norm\": 0.6282051282051282,\n \"acc_norm_stderr\": 0.024503472557110936\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977748,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977748\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849316,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849316\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.80970625798212,\n \"acc_stderr\": 0.014036945850381401,\n \"acc_norm\": 0.80970625798212,\n \"acc_norm_stderr\": 0.014036945850381401\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4849162011173184,\n \"acc_stderr\": 0.016714890379996062,\n \"acc_norm\": 0.4849162011173184,\n \"acc_norm_stderr\": 0.016714890379996062\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.019412539242032165,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.019412539242032165\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n \"acc_stderr\": 0.02796267760476891,\n \"acc_norm\": 0.8059701492537313,\n \"acc_norm_stderr\": 0.02796267760476891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4810281517747858,\n \"mc1_stderr\": 0.01749089640576235,\n \"mc2\": 0.6539318892450207,\n \"mc2_stderr\": 0.015152914709562705\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987736\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5443517816527672,\n \"acc_stderr\": 0.013718194542485601\n }\n}\n```", 
"repo_url": "https://huggingface.co/Weyaxi/Instruct-v0.2-Seraph-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T13_51_56.485977", "path": ["**/details_harness|winogrande|5_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T13-51-56.485977.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T13_51_56.485977", "path": ["results_2023-12-13T13-51-56.485977.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T13-51-56.485977.parquet"]}]}]} | 2023-12-13T13:55:29+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B
Dataset automatically created during the evaluation run of model Weyaxi/Instruct-v0.2-Seraph-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
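A minimal sketch is shown below; the details repository name follows the `open-llm-leaderboard/details_<org>__<model>` convention used by these evaluation dumps, and the choice of the Winogrande configuration is only an illustrative assumption:

```python
from datasets import load_dataset

# Assumed repository name, following the details_<org>__<model> naming convention.
data = load_dataset(
    "open-llm-leaderboard/details_Weyaxi__Instruct-v0.2-Seraph-7B",
    "harness_winogrande_5",  # any of the 63 task configurations can be used here
    split="train",           # "train" always points to the latest results
)
```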
## Latest results
These are the latest results from run 2023-12-13T13:51:56.485977 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Instruct-v0.2-Seraph-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:51:56.485977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Instruct-v0.2-Seraph-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:51:56.485977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Weyaxi/Instruct-v0.2-Seraph-7B\n\n\n\nDataset automatically created during the evaluation run of model Weyaxi/Instruct-v0.2-Seraph-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T13:51:56.485977(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
33835897e4312be2c281ed21fad38d7af47f3ee6 |
# Dataset of fang (Arknights)
This is the dataset of fang (Arknights), containing 61 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains crawlers and other things: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 61 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 146 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 159 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 61 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 61 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 61 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 146 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 146 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 84 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 159 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 159 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
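As an illustrative sketch (not part of the original card), one way to fetch one of the archives listed above is with the `huggingface_hub` client; the filename comes from the table and the repository id from this card:

```python
from huggingface_hub import hf_hub_download
import zipfile

# Download the 384x512 aligned archive from this dataset repository.
path = hf_hub_download(
    repo_id="AppleHarem/fang_arknights",
    filename="dataset-384x512.zip",
    repo_type="dataset",
)

# Extract the images and their tag files into a local folder.
with zipfile.ZipFile(path) as zf:
    zf.extractall("fang_arknights_384x512")
```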
| AppleHarem/fang_arknights | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-12-13T13:58:03+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-13T18:43:56+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of fang (Arknights)
===========================
This is the dataset of fang (Arknights), containing 61 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by DeepGHS Team(huggingface organization).
This is a WebUI that contains crawlers and other things: (LittleAppleWebUI)
| [] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] | [
44
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
bd876e6fc7372d4f6f2f307d330b396042b14c8d |
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad](https://huggingface.co/abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama2-chat-5000-v1.0-squad",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-13T13:57:32.588112](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama2-chat-5000-v1.0-squad/blob/main/results_2023-12-13T13-57-32.588112.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.46553834248291337,
"acc_stderr": 0.034519983038704787,
"acc_norm": 0.4702267174194307,
"acc_norm_stderr": 0.035292720953402384,
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4445304155126561,
"mc2_stderr": 0.015320779641152712
},
"harness|arc:challenge|25": {
"acc": 0.4735494880546075,
"acc_stderr": 0.014590931358120172,
"acc_norm": 0.5093856655290102,
"acc_norm_stderr": 0.014608816322065003
},
"harness|hellaswag|10": {
"acc": 0.5750846444931289,
"acc_stderr": 0.004933198776700262,
"acc_norm": 0.7660824536944831,
"acc_norm_stderr": 0.004224552134436874
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.040685900502249704,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.040685900502249704
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5208333333333334,
"acc_stderr": 0.041775789507399935,
"acc_norm": 0.5208333333333334,
"acc_norm_stderr": 0.041775789507399935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3699421965317919,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.3699421965317919,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.042801058373643966,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.042801058373643966
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3829787234042553,
"acc_stderr": 0.03177821250236922,
"acc_norm": 0.3829787234042553,
"acc_norm_stderr": 0.03177821250236922
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.04489539350270701,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.04489539350270701
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23015873015873015,
"acc_stderr": 0.03764950879790605,
"acc_norm": 0.23015873015873015,
"acc_norm_stderr": 0.03764950879790605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5419354838709678,
"acc_stderr": 0.028343787250540625,
"acc_norm": 0.5419354838709678,
"acc_norm_stderr": 0.028343787250540625
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3645320197044335,
"acc_stderr": 0.033864057460620905,
"acc_norm": 0.3645320197044335,
"acc_norm_stderr": 0.033864057460620905
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5858585858585859,
"acc_stderr": 0.03509438348879629,
"acc_norm": 0.5858585858585859,
"acc_norm_stderr": 0.03509438348879629
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.694300518134715,
"acc_stderr": 0.033248379397581594,
"acc_norm": 0.694300518134715,
"acc_norm_stderr": 0.033248379397581594
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4282051282051282,
"acc_stderr": 0.025088301454694838,
"acc_norm": 0.4282051282051282,
"acc_norm_stderr": 0.025088301454694838
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871934,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871934
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.031429466378837076,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.031429466378837076
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6513761467889908,
"acc_stderr": 0.02043125409071432,
"acc_norm": 0.6513761467889908,
"acc_norm_stderr": 0.02043125409071432
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.03214952147802749,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.03214952147802749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03454236585380608,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03454236585380608
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.569620253164557,
"acc_stderr": 0.032230171959375976,
"acc_norm": 0.569620253164557,
"acc_norm_stderr": 0.032230171959375976
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5560538116591929,
"acc_stderr": 0.03334625674242728,
"acc_norm": 0.5560538116591929,
"acc_norm_stderr": 0.03334625674242728
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5267175572519084,
"acc_stderr": 0.04379024936553894,
"acc_norm": 0.5267175572519084,
"acc_norm_stderr": 0.04379024936553894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6837606837606838,
"acc_stderr": 0.030463656747340254,
"acc_norm": 0.6837606837606838,
"acc_norm_stderr": 0.030463656747340254
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6577266922094508,
"acc_stderr": 0.016967031766413617,
"acc_norm": 0.6577266922094508,
"acc_norm_stderr": 0.016967031766413617
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.523121387283237,
"acc_stderr": 0.026890297881303118,
"acc_norm": 0.523121387283237,
"acc_norm_stderr": 0.026890297881303118
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.01424263007057489,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.01424263007057489
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.028607893699576066,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.028607893699576066
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.028290869054197604,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.028290869054197604
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5493827160493827,
"acc_stderr": 0.027684721415656192,
"acc_norm": 0.5493827160493827,
"acc_norm_stderr": 0.027684721415656192
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.027889139300534785,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.027889139300534785
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.32790091264667537,
"acc_stderr": 0.011989936640666523,
"acc_norm": 0.32790091264667537,
"acc_norm_stderr": 0.011989936640666523
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.45751633986928103,
"acc_stderr": 0.020154685712590895,
"acc_norm": 0.45751633986928103,
"acc_norm_stderr": 0.020154685712590895
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794915,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794915
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5671641791044776,
"acc_stderr": 0.03503490923673282,
"acc_norm": 0.5671641791044776,
"acc_norm_stderr": 0.03503490923673282
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.035282112582452306,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.035282112582452306
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2864137086903305,
"mc1_stderr": 0.015826142439502353,
"mc2": 0.4445304155126561,
"mc2_stderr": 0.015320779641152712
},
"harness|winogrande|5": {
"acc": 0.7198105761641673,
"acc_stderr": 0.012621707979798497
},
"harness|gsm8k|5": {
"acc": 0.16148597422289612,
"acc_stderr": 0.010135959452134323
}
}
```
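The aggregated numbers above are also exposed through the "results" configuration mentioned earlier; a minimal sketch of reading it, assuming the "latest" split that these evaluation repositories typically provide:

```python
from datasets import load_dataset

# Load the aggregated results of the run; "latest" points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama2-chat-5000-v1.0-squad",
    "results",
    split="latest",
)
```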
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama2-chat-5000-v1.0-squad | [
"region:us"
] | 2023-12-13T14:00:26+00:00 | {"pretty_name": "Evaluation run of abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad", "dataset_summary": "Dataset automatically created during the evaluation run of model [abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad](https://huggingface.co/abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama2-chat-5000-v1.0-squad\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T13:57:32.588112](https://huggingface.co/datasets/open-llm-leaderboard/details_abdulrahman-nuzha__finetuned-llama2-chat-5000-v1.0-squad/blob/main/results_2023-12-13T13-57-32.588112.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46553834248291337,\n \"acc_stderr\": 0.034519983038704787,\n \"acc_norm\": 0.4702267174194307,\n \"acc_norm_stderr\": 0.035292720953402384,\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4445304155126561,\n \"mc2_stderr\": 0.015320779641152712\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.4735494880546075,\n \"acc_stderr\": 0.014590931358120172,\n \"acc_norm\": 0.5093856655290102,\n \"acc_norm_stderr\": 0.014608816322065003\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5750846444931289,\n \"acc_stderr\": 0.004933198776700262,\n \"acc_norm\": 0.7660824536944831,\n \"acc_norm_stderr\": 0.004224552134436874\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.040685900502249704,\n \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.040685900502249704\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5208333333333334,\n \"acc_stderr\": 0.041775789507399935,\n \"acc_norm\": 
0.5208333333333334,\n \"acc_norm_stderr\": 0.041775789507399935\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3699421965317919,\n \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.3699421965317919,\n \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.042801058373643966,\n \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.042801058373643966\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.3829787234042553,\n \"acc_stderr\": 0.03177821250236922,\n \"acc_norm\": 0.3829787234042553,\n \"acc_norm_stderr\": 0.03177821250236922\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.04489539350270701,\n \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.04489539350270701\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.04144311810878151,\n \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.04144311810878151\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.023919984164047732,\n \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.023919984164047732\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23015873015873015,\n \"acc_stderr\": 0.03764950879790605,\n \"acc_norm\": 0.23015873015873015,\n \"acc_norm_stderr\": 0.03764950879790605\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5419354838709678,\n \"acc_stderr\": 0.028343787250540625,\n \"acc_norm\": 0.5419354838709678,\n \"acc_norm_stderr\": 0.028343787250540625\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3645320197044335,\n \"acc_stderr\": 0.033864057460620905,\n \"acc_norm\": 0.3645320197044335,\n \"acc_norm_stderr\": 0.033864057460620905\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.5858585858585859,\n \"acc_stderr\": 0.03509438348879629,\n \"acc_norm\": 0.5858585858585859,\n \"acc_norm_stderr\": 0.03509438348879629\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.694300518134715,\n 
\"acc_stderr\": 0.033248379397581594,\n \"acc_norm\": 0.694300518134715,\n \"acc_norm_stderr\": 0.033248379397581594\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.4282051282051282,\n \"acc_stderr\": 0.025088301454694838,\n \"acc_norm\": 0.4282051282051282,\n \"acc_norm_stderr\": 0.025088301454694838\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871934,\n \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871934\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.031429466378837076,\n \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.031429466378837076\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6513761467889908,\n \"acc_stderr\": 0.02043125409071432,\n \"acc_norm\": 0.6513761467889908,\n \"acc_norm_stderr\": 0.02043125409071432\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.03214952147802749,\n \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.03214952147802749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03454236585380608,\n \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03454236585380608\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.569620253164557,\n \"acc_stderr\": 0.032230171959375976,\n \"acc_norm\": 0.569620253164557,\n \"acc_norm_stderr\": 0.032230171959375976\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5560538116591929,\n \"acc_stderr\": 0.03334625674242728,\n \"acc_norm\": 0.5560538116591929,\n \"acc_norm_stderr\": 0.03334625674242728\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5267175572519084,\n \"acc_stderr\": 0.04379024936553894,\n \"acc_norm\": 0.5267175572519084,\n \"acc_norm_stderr\": 0.04379024936553894\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6837606837606838,\n \"acc_stderr\": 0.030463656747340254,\n \"acc_norm\": 0.6837606837606838,\n \"acc_norm_stderr\": 0.030463656747340254\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 
0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6577266922094508,\n \"acc_stderr\": 0.016967031766413617,\n \"acc_norm\": 0.6577266922094508,\n \"acc_norm_stderr\": 0.016967031766413617\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.523121387283237,\n \"acc_stderr\": 0.026890297881303118,\n \"acc_norm\": 0.523121387283237,\n \"acc_norm_stderr\": 0.026890297881303118\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.01424263007057489,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.01424263007057489\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.028607893699576066,\n \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.028607893699576066\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n \"acc_stderr\": 0.028290869054197604,\n \"acc_norm\": 0.5434083601286174,\n \"acc_norm_stderr\": 0.028290869054197604\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5493827160493827,\n \"acc_stderr\": 0.027684721415656192,\n \"acc_norm\": 0.5493827160493827,\n \"acc_norm_stderr\": 0.027684721415656192\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.32269503546099293,\n \"acc_stderr\": 0.027889139300534785,\n \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.027889139300534785\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.32790091264667537,\n \"acc_stderr\": 0.011989936640666523,\n \"acc_norm\": 0.32790091264667537,\n \"acc_norm_stderr\": 0.011989936640666523\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.030273325077345755,\n \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.030273325077345755\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.45751633986928103,\n \"acc_stderr\": 0.020154685712590895,\n \"acc_norm\": 0.45751633986928103,\n \"acc_norm_stderr\": 0.020154685712590895\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n \"acc_stderr\": 0.04785964010794915,\n \"acc_norm\": 0.5181818181818182,\n \"acc_norm_stderr\": 0.04785964010794915\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5671641791044776,\n \"acc_stderr\": 0.03503490923673282,\n \"acc_norm\": 0.5671641791044776,\n \"acc_norm_stderr\": 0.03503490923673282\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.035282112582452306,\n \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.035282112582452306\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2864137086903305,\n \"mc1_stderr\": 0.015826142439502353,\n \"mc2\": 0.4445304155126561,\n \"mc2_stderr\": 0.015320779641152712\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7198105761641673,\n 
\"acc_stderr\": 0.012621707979798497\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.16148597422289612,\n \"acc_stderr\": 0.010135959452134323\n }\n}\n```", "repo_url": "https://huggingface.co/abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-57-32.588112.parquet", 
"**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-57-32.588112.parquet", 
"**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-57-32.588112.parquet", 
"**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T13-57-32.588112.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", 
"data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": 
"2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T13-57-32.588112.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["**/details_harness|winogrande|5_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": 
["**/details_harness|winogrande|5_2023-12-13T13-57-32.588112.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T13_57_32.588112", "path": ["results_2023-12-13T13-57-32.588112.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T13-57-32.588112.parquet"]}]}]} | 2023-12-13T14:01:13+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad
Dataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T13:57:32.588112 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:57:32.588112(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T13:57:32.588112(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
209,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad\n\n\n\nDataset automatically created during the evaluation run of model abdulrahman-nuzha/finetuned-llama2-chat-5000-v1.0-squad on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T13:57:32.588112(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]"
] |
e2fe7e60983baa99ece3580c5519c543c1180fb3 |
# Dataset Card for Evaluation run of Q-bert/Terminis-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Q-bert/Terminis-7B](https://huggingface.co/Q-bert/Terminis-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset

# Per-sample details for the 5-shot Winogrande task; the "train" split always
# points to the latest results.
data = load_dataset("open-llm-leaderboard/details_Q-bert__Terminis-7B",
	"harness_winogrande_5",
	split="train")
```
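Each evaluated task has its own configuration. As a complementary illustration, here is a minimal sketch (assuming the standard `datasets` API and the configuration/split layout described above) for listing the available configurations and loading the aggregated "results" configuration:

```python
from datasets import get_dataset_config_names, load_dataset

# One configuration per evaluated task, plus the aggregated "results" configuration.
configs = get_dataset_config_names("open-llm-leaderboard/details_Q-bert__Terminis-7B")
print(len(configs))

# The "latest" split of the "results" configuration points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Q-bert__Terminis-7B",
    "results",
    split="latest",
)
print(results[0])
```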
## Latest results
These are the [latest results from run 2023-12-13T14:00:12.819562](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Terminis-7B/blob/main/results_2023-12-13T14-00-12.819562.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6432896842220583,
"acc_stderr": 0.03235458873777211,
"acc_norm": 0.6451073101562141,
"acc_norm_stderr": 0.03300952286826437,
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6731465711305621,
"mc2_stderr": 0.015142056894568223
},
"harness|arc:challenge|25": {
"acc": 0.6518771331058021,
"acc_stderr": 0.01392100859517935,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946531
},
"harness|hellaswag|10": {
"acc": 0.6804421429994025,
"acc_stderr": 0.004653523038369371,
"acc_norm": 0.8621788488348935,
"acc_norm_stderr": 0.003440076775300575
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155247,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155247
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5467980295566502,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.5467980295566502,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.028972648884844267,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.028972648884844267
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163265,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807897,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069425,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069425
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.03192193448934724,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.03192193448934724
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4692737430167598,
"acc_stderr": 0.016690896161944385,
"acc_norm": 0.4692737430167598,
"acc_norm_stderr": 0.016690896161944385
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890155,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890155
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4621903520208605,
"acc_stderr": 0.012733671880342506,
"acc_norm": 0.4621903520208605,
"acc_norm_stderr": 0.012733671880342506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5214198286413708,
"mc1_stderr": 0.01748743214471164,
"mc2": 0.6731465711305621,
"mc2_stderr": 0.015142056894568223
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.5754359363153905,
"acc_stderr": 0.013614835574956378
}
}
```
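As an illustration only, here is a minimal sketch (the helper name and the sample dict below are hypothetical, not part of the official tooling) of how a results dictionary shaped like the block above could be post-processed, for example to surface the weakest MMLU subtasks:

```python
def weakest_mmlu_subtasks(results: dict, n: int = 5) -> list[tuple[str, float]]:
    """Return the n lowest-scoring MMLU (hendrycksTest) subtasks from a results
    dict shaped like the block above."""
    mmlu = {
        name.removeprefix("harness|hendrycksTest-").removesuffix("|5"): task["acc"]
        for name, task in results.items()
        if name.startswith("harness|hendrycksTest-")
    }
    return sorted(mmlu.items(), key=lambda kv: kv[1])[:n]


# Example with two of the entries shown above:
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.32},
    "harness|hendrycksTest-college_mathematics|5": {"acc": 0.31},
}
print(weakest_mmlu_subtasks(sample, n=2))
```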
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Q-bert__Terminis-7B | [
"region:us"
] | 2023-12-13T14:03:04+00:00 | {"pretty_name": "Evaluation run of Q-bert/Terminis-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [Q-bert/Terminis-7B](https://huggingface.co/Q-bert/Terminis-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Q-bert__Terminis-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:00:12.819562](https://huggingface.co/datasets/open-llm-leaderboard/details_Q-bert__Terminis-7B/blob/main/results_2023-12-13T14-00-12.819562.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6432896842220583,\n \"acc_stderr\": 0.03235458873777211,\n \"acc_norm\": 0.6451073101562141,\n \"acc_norm_stderr\": 0.03300952286826437,\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6731465711305621,\n \"mc2_stderr\": 0.015142056894568223\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6518771331058021,\n \"acc_stderr\": 0.01392100859517935,\n \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946531\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6804421429994025,\n \"acc_stderr\": 0.004653523038369371,\n \"acc_norm\": 0.8621788488348935,\n \"acc_norm_stderr\": 0.003440076775300575\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.02713429162874171,\n \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.02713429162874171\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n 
\"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155247,\n \"acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155247\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5467980295566502,\n \"acc_stderr\": 0.03502544650845872,\n \"acc_norm\": 0.5467980295566502,\n \"acc_norm_stderr\": 0.03502544650845872\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6538461538461539,\n 
\"acc_stderr\": 0.024121125416941197,\n \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34444444444444444,\n \"acc_stderr\": 0.028972648884844267,\n \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.028972648884844267\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163265,\n \"acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163265\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807897,\n \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807897\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069425,\n \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069425\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.03192193448934724,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.03192193448934724\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 
0.8084291187739464,\n \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4692737430167598,\n \"acc_stderr\": 0.016690896161944385,\n \"acc_norm\": 0.4692737430167598,\n \"acc_norm_stderr\": 0.016690896161944385\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890155,\n \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890155\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4621903520208605,\n \"acc_stderr\": 0.012733671880342506,\n \"acc_norm\": 0.4621903520208605,\n \"acc_norm_stderr\": 0.012733671880342506\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5214198286413708,\n \"mc1_stderr\": 0.01748743214471164,\n \"mc2\": 0.6731465711305621,\n \"mc2_stderr\": 0.015142056894568223\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5754359363153905,\n \"acc_stderr\": 0.013614835574956378\n }\n}\n```", "repo_url": "https://huggingface.co/Q-bert/Terminis-7B", 
"leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-12.819562.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-12.819562.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-12.819562.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-12.819562.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-12.819562.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-12.819562.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["**/details_harness|winogrande|5_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-00-12.819562.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_00_12.819562", "path": ["results_2023-12-13T14-00-12.819562.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-00-12.819562.parquet"]}]}]} | 2023-12-13T14:03:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Q-bert/Terminis-7B
Dataset automatically created during the evaluation run of model Q-bert/Terminis-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
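The loading snippet given in this card's configuration metadata is reproduced below for convenience; it pulls the Winogrande details split, and the same pattern works for any of the other task configurations listed in the metadata.

```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_Q-bert__Terminis-7B",
	"harness_winogrande_5",
	split="train")
```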
## Latest results
These are the latest results from run 2023-12-13T14:00:12.819562 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
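A minimal sketch for retrieving these aggregated metrics programmatically, assuming the "results" configuration and the "latest" split listed in this card's configuration metadata:

```python
from datasets import load_dataset

# Run-level aggregated metrics; the "latest" split points to the most recent evaluation.
aggregated = load_dataset("open-llm-leaderboard/details_Q-bert__Terminis-7B",
	"results",
	split="latest")
print(aggregated[0])
```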
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Q-bert/Terminis-7B\n\n\n\nDataset automatically created during the evaluation run of model Q-bert/Terminis-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:00:12.819562(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Q-bert/Terminis-7B\n\n\n\nDataset automatically created during the evaluation run of model Q-bert/Terminis-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:00:12.819562(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
177,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Q-bert/Terminis-7B\n\n\n\nDataset automatically created during the evaluation run of model Q-bert/Terminis-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:00:12.819562(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
1e353c66797da071a9da426d686cc716bfd5e432 |
# Dataset Card for Evaluation run of GreenNode/Merged-DPO-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [GreenNode/Merged-DPO-7B](https://huggingface.co/GreenNode/Merged-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_GreenNode__Merged-DPO-7B",
"harness_winogrande_5",
split="train")
```
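
If you want the aggregated scores rather than per-example details, you can load the "results" configuration mentioned above instead. This is a minimal sketch: it assumes the "latest" split alias described in this card, and it only inspects whatever columns the parquet file happens to contain rather than relying on a fixed schema.

```python
from datasets import load_dataset

# Load the aggregated results (one row per evaluation run).
results = load_dataset(
    "open-llm-leaderboard/details_GreenNode__Merged-DPO-7B",
    "results",
    split="latest",
)

# Inspect the available columns before relying on any particular field.
print(results.column_names)
print(results[0])
```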
## Latest results
These are the [latest results from run 2023-12-13T14:00:25.287195](https://huggingface.co/datasets/open-llm-leaderboard/details_GreenNode__Merged-DPO-7B/blob/main/results_2023-12-13T14-00-25.287195.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.560167032409899,
"acc_stderr": 0.034083462253007915,
"acc_norm": 0.5612537132182182,
"acc_norm_stderr": 0.034785117565412534,
"mc1": 0.5899632802937577,
"mc1_stderr": 0.01721784471744932,
"mc2": 0.7276047803006407,
"mc2_stderr": 0.014645147930666262
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820164,
"acc_norm": 0.689419795221843,
"acc_norm_stderr": 0.013522292098053059
},
"harness|hellaswag|10": {
"acc": 0.7271459868552081,
"acc_stderr": 0.00444516099761836,
"acc_norm": 0.8775144393547102,
"acc_norm_stderr": 0.0032717574453291595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.042992689054808644,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.042992689054808644
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.039531733777491945,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.039531733777491945
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286634,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286634
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.03801685104524458,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.03801685104524458
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.046570472605949625,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.046570472605949625
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.043062412591271526,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.043062412591271526
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6290322580645161,
"acc_stderr": 0.027480541887953593,
"acc_norm": 0.6290322580645161,
"acc_norm_stderr": 0.027480541887953593
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964684,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964684
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8082901554404145,
"acc_stderr": 0.02840895362624529,
"acc_norm": 0.8082901554404145,
"acc_norm_stderr": 0.02840895362624529
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253252,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253252
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.032183581077426124,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.032183581077426124
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7724770642201835,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.7724770642201835,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.031493281045079556,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.031493281045079556
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.628099173553719,
"acc_stderr": 0.04412015806624504,
"acc_norm": 0.628099173553719,
"acc_norm_stderr": 0.04412015806624504
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326469,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326469
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009182,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009182
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988827,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5404624277456648,
"acc_stderr": 0.026830805998952236,
"acc_norm": 0.5404624277456648,
"acc_norm_stderr": 0.026830805998952236
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5163398692810458,
"acc_stderr": 0.028614624752805434,
"acc_norm": 0.5163398692810458,
"acc_norm_stderr": 0.028614624752805434
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6141479099678456,
"acc_stderr": 0.027648149599751464,
"acc_norm": 0.6141479099678456,
"acc_norm_stderr": 0.027648149599751464
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.0293922365846125,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.0293922365846125
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4067796610169492,
"acc_stderr": 0.012546325596569536,
"acc_norm": 0.4067796610169492,
"acc_norm_stderr": 0.012546325596569536
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5661764705882353,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.5661764705882353,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5408496732026143,
"acc_stderr": 0.020160213617222516,
"acc_norm": 0.5408496732026143,
"acc_norm_stderr": 0.020160213617222516
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5899632802937577,
"mc1_stderr": 0.01721784471744932,
"mc2": 0.7276047803006407,
"mc2_stderr": 0.014645147930666262
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409352
},
"harness|gsm8k|5": {
"acc": 0.4518574677786202,
"acc_stderr": 0.013708494995677641
}
}
```
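
As a quick illustration of how these numbers can be post-processed, the sketch below downloads the results JSON linked above and averages the `acc` values of the `harness|hendrycksTest-*` entries. The `resolve/main` URL and the key layout are assumptions inferred from the snippet shown here, so adjust them if the file location or format differs.

```python
import json
from urllib.request import urlopen

# Raw results file for this run (assumed path; the card links the blob view).
URL = (
    "https://huggingface.co/datasets/open-llm-leaderboard/"
    "details_GreenNode__Merged-DPO-7B/resolve/main/"
    "results_2023-12-13T14-00-25.287195.json"
)

with urlopen(URL) as response:
    report = json.load(response)

# The per-task scores may sit under a top-level "results" key in the raw file;
# fall back to the top level if the file is already the flat dict shown above.
scores = report.get("results", report)

mmlu_accs = [
    entry["acc"]
    for task, entry in scores.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"MMLU subtasks: {len(mmlu_accs)}")
print(f"Mean accuracy: {sum(mmlu_accs) / len(mmlu_accs):.4f}")
```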
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_GreenNode__Merged-DPO-7B | [
"region:us"
] | 2023-12-13T14:03:16+00:00 | {"pretty_name": "Evaluation run of GreenNode/Merged-DPO-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [GreenNode/Merged-DPO-7B](https://huggingface.co/GreenNode/Merged-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GreenNode__Merged-DPO-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:00:25.287195](https://huggingface.co/datasets/open-llm-leaderboard/details_GreenNode__Merged-DPO-7B/blob/main/results_2023-12-13T14-00-25.287195.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.560167032409899,\n \"acc_stderr\": 0.034083462253007915,\n \"acc_norm\": 0.5612537132182182,\n \"acc_norm_stderr\": 0.034785117565412534,\n \"mc1\": 0.5899632802937577,\n \"mc1_stderr\": 0.01721784471744932,\n \"mc2\": 0.7276047803006407,\n \"mc2_stderr\": 0.014645147930666262\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820164,\n \"acc_norm\": 0.689419795221843,\n \"acc_norm_stderr\": 0.013522292098053059\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7271459868552081,\n \"acc_stderr\": 0.00444516099761836,\n \"acc_norm\": 0.8775144393547102,\n \"acc_norm_stderr\": 0.0032717574453291595\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n \"acc_stderr\": 0.042992689054808644,\n \"acc_norm\": 0.5481481481481482,\n \"acc_norm_stderr\": 0.042992689054808644\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.039531733777491945,\n \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.039531733777491945\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286634,\n \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286634\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 
0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n \"acc_stderr\": 0.03801685104524458,\n \"acc_norm\": 0.5375722543352601,\n \"acc_norm_stderr\": 0.03801685104524458\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n \"acc_stderr\": 0.046570472605949625,\n \"acc_norm\": 0.4298245614035088,\n \"acc_norm_stderr\": 0.046570472605949625\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n \"acc_stderr\": 0.043062412591271526,\n \"acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.043062412591271526\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6290322580645161,\n \"acc_stderr\": 0.027480541887953593,\n \"acc_norm\": 0.6290322580645161,\n \"acc_norm_stderr\": 0.027480541887953593\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964684,\n \"acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964684\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8082901554404145,\n \"acc_stderr\": 0.02840895362624529,\n \"acc_norm\": 0.8082901554404145,\n \"acc_norm_stderr\": 0.02840895362624529\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 
0.02483881198803316,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253252,\n \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253252\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.032183581077426124,\n \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.032183581077426124\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7724770642201835,\n \"acc_stderr\": 0.017974463578776502,\n \"acc_norm\": 0.7724770642201835,\n \"acc_norm_stderr\": 0.017974463578776502\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.031493281045079556,\n \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.031493281045079556\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.628099173553719,\n \"acc_stderr\": 0.04412015806624504,\n \"acc_norm\": 0.628099173553719,\n \"acc_norm_stderr\": 0.04412015806624504\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326469,\n \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326469\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n \"acc_stderr\": 0.026853450377009182,\n \"acc_norm\": 0.7863247863247863,\n \"acc_norm_stderr\": 0.026853450377009182\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n \"acc_stderr\": 0.015133383278988827,\n \"acc_norm\": 0.7662835249042146,\n \"acc_norm_stderr\": 0.015133383278988827\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.5404624277456648,\n \"acc_stderr\": 0.026830805998952236,\n \"acc_norm\": 0.5404624277456648,\n \"acc_norm_stderr\": 0.026830805998952236\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.5163398692810458,\n \"acc_stderr\": 0.028614624752805434,\n \"acc_norm\": 0.5163398692810458,\n \"acc_norm_stderr\": 0.028614624752805434\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6141479099678456,\n \"acc_stderr\": 0.027648149599751464,\n \"acc_norm\": 0.6141479099678456,\n \"acc_norm_stderr\": 0.027648149599751464\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4148936170212766,\n \"acc_stderr\": 0.0293922365846125,\n \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.0293922365846125\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4067796610169492,\n \"acc_stderr\": 0.012546325596569536,\n \"acc_norm\": 0.4067796610169492,\n \"acc_norm_stderr\": 0.012546325596569536\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.5661764705882353,\n \"acc_stderr\": 0.030105636570016636,\n \"acc_norm\": 0.5661764705882353,\n \"acc_norm_stderr\": 0.030105636570016636\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.5408496732026143,\n \"acc_stderr\": 0.020160213617222516,\n \"acc_norm\": 0.5408496732026143,\n \"acc_norm_stderr\": 0.020160213617222516\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5899632802937577,\n \"mc1_stderr\": 0.01721784471744932,\n \"mc2\": 0.7276047803006407,\n \"mc2_stderr\": 0.014645147930666262\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409352\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4518574677786202,\n \"acc_stderr\": 0.013708494995677641\n }\n}\n```", "repo_url": "https://huggingface.co/GreenNode/Merged-DPO-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-25.287195.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-25.287195.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-25.287195.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-25.287195.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-25.287195.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-00-25.287195.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["**/details_harness|winogrande|5_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-00-25.287195.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_00_25.287195", "path": ["results_2023-12-13T14-00-25.287195.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-00-25.287195.parquet"]}]}]} | 2023-12-13T14:03:59+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of GreenNode/Merged-DPO-7B
Dataset automatically created during the evaluation run of model GreenNode/Merged-DPO-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T14:00:25.287195 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of GreenNode/Merged-DPO-7B\n\n\n\nDataset automatically created during the evaluation run of model GreenNode/Merged-DPO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:00:25.287195(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of GreenNode/Merged-DPO-7B\n\n\n\nDataset automatically created during the evaluation run of model GreenNode/Merged-DPO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:00:25.287195(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
183,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of GreenNode/Merged-DPO-7B\n\n\n\nDataset automatically created during the evaluation run of model GreenNode/Merged-DPO-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:00:25.287195(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
be3a57f5b914186a4f30a4df3d586213eb664ad0 |
# Dataset Card for Evaluation run of Fredithefish/MadMix-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Fredithefish/MadMix-v0.2](https://huggingface.co/Fredithefish/MadMix-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__MadMix-v0.2",
"harness_winogrande_5",
split="train")
```
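The same pattern applies to any of the per-task configurations listed in this repo's metadata, as well as to the aggregated "results" configuration mentioned above. A minimal sketch, assuming the configuration and split names defined in the repo metadata (e.g. `harness_arc_challenge_25`, `results`, and the `latest` split):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run (the "latest" split always
# points at the newest timestamped results parquet).
results = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__MadMix-v0.2",
    "results",
    split="latest",
)

# Per-task details follow the same split convention, e.g. ARC-Challenge 25-shot:
arc_details = load_dataset(
    "open-llm-leaderboard/details_Fredithefish__MadMix-v0.2",
    "harness_arc_challenge_25",
    split="latest",
)

print(len(arc_details), "detail rows;", len(results), "aggregated result row(s)")
```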
## Latest results
These are the [latest results from run 2023-12-13T14:05:21.026103](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__MadMix-v0.2/blob/main/results_2023-12-13T14-05-21.026103.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6369931523249152,
"acc_stderr": 0.032180405415322286,
"acc_norm": 0.643635199288352,
"acc_norm_stderr": 0.03283580466491587,
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5578972226689095,
"mc2_stderr": 0.015808791087791672
},
"harness|arc:challenge|25": {
"acc": 0.6271331058020477,
"acc_stderr": 0.014131176760131172,
"acc_norm": 0.6484641638225256,
"acc_norm_stderr": 0.013952413699600931
},
"harness|hellaswag|10": {
"acc": 0.6575383389762995,
"acc_stderr": 0.004735632975072384,
"acc_norm": 0.8353913563035252,
"acc_norm_stderr": 0.003700690995600887
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926605,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063548,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063548
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639318,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.02616056824660146,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.02616056824660146
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.038935425188248475,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.038935425188248475
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406953,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406953
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608313,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608313
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106603,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.02638527370346449,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.02638527370346449
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.01273110279050452,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.01273110279050452
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.017048857010515107,
"mc2": 0.5578972226689095,
"mc2_stderr": 0.015808791087791672
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698329
},
"harness|gsm8k|5": {
"acc": 0.3078089461713419,
"acc_stderr": 0.012714401009923649
}
}
```
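As a small illustration of how one might slice the dictionary above, the sketch below assumes it is already bound to a Python variable `results` (an assumption; obtain it however you load the JSON) and ranks the MMLU ("hendrycksTest") subtasks by normalized accuracy:

```python
# `results` is assumed to be the dictionary shown above.
mmlu = {
    task: scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest")
}

# Five strongest and five weakest MMLU subtasks by acc_norm.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for task, acc_norm in ranked[:5]:
    print(f"{task}: {acc_norm:.3f}")
for task, acc_norm in ranked[-5:]:
    print(f"{task}: {acc_norm:.3f}")
```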
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Fredithefish__MadMix-v0.2 | [
"region:us"
] | 2023-12-13T14:08:11+00:00 | {"pretty_name": "Evaluation run of Fredithefish/MadMix-v0.2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Fredithefish/MadMix-v0.2](https://huggingface.co/Fredithefish/MadMix-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__MadMix-v0.2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:05:21.026103](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__MadMix-v0.2/blob/main/results_2023-12-13T14-05-21.026103.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6369931523249152,\n \"acc_stderr\": 0.032180405415322286,\n \"acc_norm\": 0.643635199288352,\n \"acc_norm_stderr\": 0.03283580466491587,\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5578972226689095,\n \"mc2_stderr\": 0.015808791087791672\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6271331058020477,\n \"acc_stderr\": 0.014131176760131172,\n \"acc_norm\": 0.6484641638225256,\n \"acc_norm_stderr\": 0.013952413699600931\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6575383389762995,\n \"acc_stderr\": 0.004735632975072384,\n \"acc_norm\": 0.8353913563035252,\n \"acc_norm_stderr\": 0.003700690995600887\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926605,\n \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926605\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n 
\"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n 
\"acc_stderr\": 0.02407869658063548,\n \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063548\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.803921568627451,\n \"acc_stderr\": 0.027865942286639318,\n \"acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.02616056824660146,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.02616056824660146\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.038935425188248475,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.038935425188248475\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406953,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406953\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608313,\n \"acc_norm\": 0.8301404853128991,\n 
\"acc_norm_stderr\": 0.013428186370608313\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n \"acc_stderr\": 0.016204672385106603,\n \"acc_norm\": 0.376536312849162,\n \"acc_norm_stderr\": 0.016204672385106603\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.02638527370346449,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.02638527370346449\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n \"acc_stderr\": 0.01273110279050452,\n \"acc_norm\": 0.46088657105606257,\n \"acc_norm_stderr\": 0.01273110279050452\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n \"mc1_stderr\": 0.017048857010515107,\n \"mc2\": 0.5578972226689095,\n \"mc2_stderr\": 0.015808791087791672\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698329\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3078089461713419,\n \"acc_stderr\": 0.012714401009923649\n }\n}\n```", "repo_url": "https://huggingface.co/Fredithefish/MadMix-v0.2", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-05-21.026103.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-05-21.026103.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-05-21.026103.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-05-21.026103.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-05-21.026103.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-05-21.026103.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["**/details_harness|winogrande|5_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-05-21.026103.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_05_21.026103", "path": ["results_2023-12-13T14-05-21.026103.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-05-21.026103.parquet"]}]}]} | 2023-12-13T14:08:57+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Fredithefish/MadMix-v0.2
Dataset automatically created during the evaluation run of model Fredithefish/MadMix-v0.2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T14:05:21.026103 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Fredithefish/MadMix-v0.2\n\n\n\nDataset automatically created during the evaluation run of model Fredithefish/MadMix-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:05:21.026103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Fredithefish/MadMix-v0.2\n\n\n\nDataset automatically created during the evaluation run of model Fredithefish/MadMix-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:05:21.026103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Fredithefish/MadMix-v0.2\n\n\n\nDataset automatically created during the evaluation run of model Fredithefish/MadMix-v0.2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:05:21.026103(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
d22fba4c004e69e8f1bfea88c99d8e864cdb3643 |
# Dataset Card for Evaluation run of liuda1/Mistral-7B-golden
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liuda1/Mistral-7B-golden](https://huggingface.co/liuda1/Mistral-7B-golden) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liuda1__Mistral-7B-golden",
"harness_winogrande_5",
split="train")
```
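If you only need the aggregated metrics rather than the per-sample details, the "results" configuration described above can be loaded the same way. The snippet below is a minimal sketch: the column layout of the aggregated parquet is not documented in this card, so it simply inspects whatever columns are present.

```python
from datasets import load_dataset

# Aggregated metrics for this run (the "results" configuration described above);
# the "latest" split always points to the most recent evaluation.
results = load_dataset("open-llm-leaderboard/details_liuda1__Mistral-7B-golden",
	"results",
	split="latest")

# The exact column layout is not documented here, so just inspect what is available.
print(results.column_names)
print(results[0])
```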
## Latest results
These are the [latest results from run 2023-12-13T14:07:46.278312](https://huggingface.co/datasets/open-llm-leaderboard/details_liuda1__Mistral-7B-golden/blob/main/results_2023-12-13T14-07-46.278312.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5849031765863516,
"acc_stderr": 0.03309134055134218,
"acc_norm": 0.590618347589519,
"acc_norm_stderr": 0.03382284615639638,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5351003465689416,
"mc2_stderr": 0.01532071134712537
},
"harness|arc:challenge|25": {
"acc": 0.560580204778157,
"acc_stderr": 0.014503747823580122,
"acc_norm": 0.6075085324232082,
"acc_norm_stderr": 0.01426963463567073
},
"harness|hellaswag|10": {
"acc": 0.3542123083051185,
"acc_stderr": 0.004772964697941366,
"acc_norm": 0.4442342162915754,
"acc_norm_stderr": 0.004958649623815338
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.028985455652334388,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.028985455652334388
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3439153439153439,
"acc_stderr": 0.024464426625596433,
"acc_norm": 0.3439153439153439,
"acc_norm_stderr": 0.024464426625596433
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7193548387096774,
"acc_stderr": 0.025560604721022895,
"acc_norm": 0.7193548387096774,
"acc_norm_stderr": 0.025560604721022895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461217,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.044531975073749834,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.044531975073749834
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823288,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823288
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6791907514450867,
"acc_stderr": 0.0251310002336479,
"acc_norm": 0.6791907514450867,
"acc_norm_stderr": 0.0251310002336479
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.016463200238114522,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.016463200238114522
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.02705797462449438,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.02705797462449438
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579922,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.02946218923337059,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.02946218923337059
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.424380704041721,
"acc_stderr": 0.012623343757430018,
"acc_norm": 0.424380704041721,
"acc_norm_stderr": 0.012623343757430018
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.030320243265004137,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.030320243265004137
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6258169934640523,
"acc_stderr": 0.019576953122088823,
"acc_norm": 0.6258169934640523,
"acc_norm_stderr": 0.019576953122088823
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.031001209039894836,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.031001209039894836
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.031267817146631786,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.031267817146631786
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5351003465689416,
"mc2_stderr": 0.01532071134712537
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183525
},
"harness|gsm8k|5": {
"acc": 0.20318423047763456,
"acc_stderr": 0.011083227665267795
}
}
```
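The headline numbers in the snapshot above can also be pulled out programmatically. The sketch below downloads the results file linked in this section with `huggingface_hub` and averages the MMLU (hendrycksTest) accuracies; the exact nesting of the JSON (whether the per-task dict sits at the top level, as in the excerpt, or under a "results" key) is an assumption, so the code tolerates both.

```python
import json
from huggingface_hub import hf_hub_download

# Fetch the results file linked above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_liuda1__Mistral-7B-golden",
    filename="results_2023-12-13T14-07-46.278312.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# The excerpt above shows the per-task dict directly; some harness dumps nest it
# under a "results" key, so fall back accordingly (assumption, not documented here).
tasks = data.get("results", data)

overall = tasks.get("all", {})
print("overall acc:", overall.get("acc"))

# Mean accuracy over the MMLU (hendrycksTest) subjects only.
mmlu = [v["acc"] for k, v in tasks.items() if k.startswith("harness|hendrycksTest-")]
print("MMLU subjects:", len(mmlu), "mean acc:", sum(mmlu) / len(mmlu))
```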
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_liuda1__Mistral-7B-golden | [
"region:us"
] | 2023-12-13T14:10:37+00:00 | {"pretty_name": "Evaluation run of liuda1/Mistral-7B-golden", "dataset_summary": "Dataset automatically created during the evaluation run of model [liuda1/Mistral-7B-golden](https://huggingface.co/liuda1/Mistral-7B-golden) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liuda1__Mistral-7B-golden\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:07:46.278312](https://huggingface.co/datasets/open-llm-leaderboard/details_liuda1__Mistral-7B-golden/blob/main/results_2023-12-13T14-07-46.278312.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5849031765863516,\n \"acc_stderr\": 0.03309134055134218,\n \"acc_norm\": 0.590618347589519,\n \"acc_norm_stderr\": 0.03382284615639638,\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5351003465689416,\n \"mc2_stderr\": 0.01532071134712537\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.560580204778157,\n \"acc_stderr\": 0.014503747823580122,\n \"acc_norm\": 0.6075085324232082,\n \"acc_norm_stderr\": 0.01426963463567073\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.3542123083051185,\n \"acc_stderr\": 0.004772964697941366,\n \"acc_norm\": 0.4442342162915754,\n \"acc_norm_stderr\": 0.004958649623815338\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.028985455652334388,\n \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.028985455652334388\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n 
\"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3439153439153439,\n \"acc_stderr\": 0.024464426625596433,\n \"acc_norm\": 0.3439153439153439,\n \"acc_norm_stderr\": 0.024464426625596433\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7193548387096774,\n \"acc_stderr\": 0.025560604721022895,\n \"acc_norm\": 0.7193548387096774,\n \"acc_norm_stderr\": 0.025560604721022895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533084,\n \"acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533084\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5820512820512821,\n 
\"acc_stderr\": 0.025007329882461217,\n \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461217\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n \"acc_stderr\": 0.044531975073749834,\n \"acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.044531975073749834\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n \"acc_stderr\": 0.014743125394823288,\n \"acc_norm\": 0.7828863346104725,\n \"acc_norm_stderr\": 
0.014743125394823288\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6791907514450867,\n \"acc_stderr\": 0.0251310002336479,\n \"acc_norm\": 0.6791907514450867,\n \"acc_norm_stderr\": 0.0251310002336479\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n \"acc_stderr\": 0.016463200238114522,\n \"acc_norm\": 0.4122905027932961,\n \"acc_norm_stderr\": 0.016463200238114522\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.02705797462449438,\n \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.02705797462449438\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579922,\n \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579922\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4219858156028369,\n \"acc_stderr\": 0.02946218923337059,\n \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.02946218923337059\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.424380704041721,\n \"acc_stderr\": 0.012623343757430018,\n \"acc_norm\": 0.424380704041721,\n \"acc_norm_stderr\": 0.012623343757430018\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.030320243265004137,\n \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.030320243265004137\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6258169934640523,\n \"acc_stderr\": 0.019576953122088823,\n \"acc_norm\": 0.6258169934640523,\n \"acc_norm_stderr\": 0.019576953122088823\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.031001209039894836,\n \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.031001209039894836\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.031267817146631786,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.031267817146631786\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5351003465689416,\n \"mc2_stderr\": 0.01532071134712537\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183525\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20318423047763456,\n \"acc_stderr\": 0.011083227665267795\n }\n}\n```", "repo_url": "https://huggingface.co/liuda1/Mistral-7B-golden", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-07-46.278312.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-07-46.278312.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-07-46.278312.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-07-46.278312.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-07-46.278312.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-07-46.278312.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["**/details_harness|winogrande|5_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-07-46.278312.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_07_46.278312", "path": ["results_2023-12-13T14-07-46.278312.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-07-46.278312.parquet"]}]}]} | 2023-12-13T14:11:22+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of liuda1/Mistral-7B-golden
Dataset automatically created during the evaluation run of model liuda1/Mistral-7B-golden on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T14:07:46.278312 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of liuda1/Mistral-7B-golden\n\n\n\nDataset automatically created during the evaluation run of model liuda1/Mistral-7B-golden on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:07:46.278312(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of liuda1/Mistral-7B-golden\n\n\n\nDataset automatically created during the evaluation run of model liuda1/Mistral-7B-golden on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:07:46.278312(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
181,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of liuda1/Mistral-7B-golden\n\n\n\nDataset automatically created during the evaluation run of model liuda1/Mistral-7B-golden on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:07:46.278312(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
74130d5db56381d3c574e18eead374b80ad3ef28 |
# Dataset Card for Evaluation run of AA051610/A13
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [AA051610/A13](https://huggingface.co/AA051610/A13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_AA051610__A13",
"harness_winogrande_5",
split="train")
```
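Similarly, the aggregated metrics mentioned above can be pulled from the "results" configuration. The sketch below is illustrative: the configuration and split names are taken from this card's metadata ("results" with a "latest" split), not from a separate API.

```python
from datasets import load_dataset

# Load the run-level aggregates; the "latest" split always points to the
# most recent evaluation run of this model.
results = load_dataset("open-llm-leaderboard/details_AA051610__A13",
                       "results",
                       split="latest")

# Each row corresponds to one evaluation run; inspect the first one.
print(results[0])
```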
## Latest results
These are the [latest results from run 2023-12-13T14:08:54.129715](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A13/blob/main/results_2023-12-13T14-08-54.129715.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6920824964752141,
"acc_stderr": 0.03046911688711296,
"acc_norm": 0.6967692736238253,
"acc_norm_stderr": 0.031060503746157857,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5325324692481855,
"mc2_stderr": 0.015130320422933614
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522075,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6164110734913364,
"acc_stderr": 0.004852658876775387,
"acc_norm": 0.8169687313284206,
"acc_norm_stderr": 0.0038590186619619966
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8026315789473685,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.8026315789473685,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7471698113207547,
"acc_stderr": 0.026749899771241214,
"acc_norm": 0.7471698113207547,
"acc_norm_stderr": 0.026749899771241214
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03309615177059007,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03309615177059007
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7021276595744681,
"acc_stderr": 0.02989614568209546,
"acc_norm": 0.7021276595744681,
"acc_norm_stderr": 0.02989614568209546
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6827586206896552,
"acc_stderr": 0.038783523721386236,
"acc_norm": 0.6827586206896552,
"acc_norm_stderr": 0.038783523721386236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5211640211640212,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.5211640211640212,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8612903225806452,
"acc_stderr": 0.01966296132141402,
"acc_norm": 0.8612903225806452,
"acc_norm_stderr": 0.01966296132141402
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.541871921182266,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.541871921182266,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8787878787878788,
"acc_stderr": 0.023253157951942084,
"acc_norm": 0.8787878787878788,
"acc_norm_stderr": 0.023253157951942084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7487179487179487,
"acc_stderr": 0.021992016662370564,
"acc_norm": 0.7487179487179487,
"acc_norm_stderr": 0.021992016662370564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7983193277310925,
"acc_stderr": 0.02606431340630453,
"acc_norm": 0.7983193277310925,
"acc_norm_stderr": 0.02606431340630453
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588963,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588963
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8823529411764706,
"acc_stderr": 0.022613286601132012,
"acc_norm": 0.8823529411764706,
"acc_norm_stderr": 0.022613286601132012
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878474,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878474
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.757847533632287,
"acc_stderr": 0.028751392398694755,
"acc_norm": 0.757847533632287,
"acc_norm_stderr": 0.028751392398694755
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.04697113923010213,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.04697113923010213
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9145299145299145,
"acc_stderr": 0.018315891685625838,
"acc_norm": 0.9145299145299145,
"acc_norm_stderr": 0.018315891685625838
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8748403575989783,
"acc_stderr": 0.01183295423930574,
"acc_norm": 0.8748403575989783,
"acc_norm_stderr": 0.01183295423930574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7687861271676301,
"acc_stderr": 0.02269865716785571,
"acc_norm": 0.7687861271676301,
"acc_norm_stderr": 0.02269865716785571
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.016295332328155814,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.016295332328155814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.0242886194660461,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.0242886194660461
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7684887459807074,
"acc_stderr": 0.023956532766639133,
"acc_norm": 0.7684887459807074,
"acc_norm_stderr": 0.023956532766639133
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7716049382716049,
"acc_stderr": 0.023358211840626267,
"acc_norm": 0.7716049382716049,
"acc_norm_stderr": 0.023358211840626267
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5602836879432624,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.5602836879432624,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5234680573663625,
"acc_stderr": 0.012756161942523355,
"acc_norm": 0.5234680573663625,
"acc_norm_stderr": 0.012756161942523355
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.75,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.75,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399677,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399677
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101696,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101696
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.92,
"acc_stderr": 0.0272659924344291,
"acc_norm": 0.92,
"acc_norm_stderr": 0.0272659924344291
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190793,
"mc2": 0.5325324692481855,
"mc2_stderr": 0.015130320422933614
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569572
},
"harness|gsm8k|5": {
"acc": 0.5269143290371494,
"acc_stderr": 0.013752517189717468
}
}
```
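To work with these numbers programmatically, a minimal sketch is shown below. It assumes `results` holds the dictionary printed above, for example a local copy saved as `latest_results.json` (the path is illustrative; note that the full results file linked in this section may nest this dictionary under a `"results"` key).

```python
import json

# Load a local copy of the dictionary shown above (illustrative path).
with open("latest_results.json") as f:
    results = json.load(f)

# Run-level aggregates live under the "all" key.
print("overall acc:", results["all"]["acc"])
print("overall acc_norm:", results["all"]["acc_norm"])

# Average the 5-shot accuracies over all MMLU (hendrycksTest) subtasks.
mmlu = {k: v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")}
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```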
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_AA051610__A13 | [
"region:us"
] | 2023-12-13T14:11:43+00:00 | {"pretty_name": "Evaluation run of AA051610/A13", "dataset_summary": "Dataset automatically created during the evaluation run of model [AA051610/A13](https://huggingface.co/AA051610/A13) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_AA051610__A13\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:08:54.129715](https://huggingface.co/datasets/open-llm-leaderboard/details_AA051610__A13/blob/main/results_2023-12-13T14-08-54.129715.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6920824964752141,\n \"acc_stderr\": 0.03046911688711296,\n \"acc_norm\": 0.6967692736238253,\n \"acc_norm_stderr\": 0.031060503746157857,\n \"mc1\": 0.3806609547123623,\n \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5325324692481855,\n \"mc2_stderr\": 0.015130320422933614\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522075,\n \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6164110734913364,\n \"acc_stderr\": 0.004852658876775387,\n \"acc_norm\": 0.8169687313284206,\n \"acc_norm_stderr\": 0.0038590186619619966\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.8026315789473685,\n \"acc_stderr\": 0.03238981601699397,\n \"acc_norm\": 0.8026315789473685,\n \"acc_norm_stderr\": 0.03238981601699397\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7471698113207547,\n \"acc_stderr\": 0.026749899771241214,\n \"acc_norm\": 0.7471698113207547,\n \"acc_norm_stderr\": 0.026749899771241214\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.03309615177059007,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.03309615177059007\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 
0.050211673156867795\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.7021276595744681,\n \"acc_stderr\": 0.02989614568209546,\n \"acc_norm\": 0.7021276595744681,\n \"acc_norm_stderr\": 0.02989614568209546\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6827586206896552,\n \"acc_stderr\": 0.038783523721386236,\n \"acc_norm\": 0.6827586206896552,\n \"acc_norm_stderr\": 0.038783523721386236\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.5211640211640212,\n \"acc_stderr\": 0.025728230952130726,\n \"acc_norm\": 0.5211640211640212,\n \"acc_norm_stderr\": 0.025728230952130726\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8612903225806452,\n \"acc_stderr\": 0.01966296132141402,\n \"acc_norm\": 0.8612903225806452,\n \"acc_norm_stderr\": 0.01966296132141402\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.541871921182266,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\": 0.541871921182266,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8787878787878788,\n \"acc_stderr\": 0.023253157951942084,\n \"acc_norm\": 0.8787878787878788,\n \"acc_norm_stderr\": 0.023253157951942084\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.7487179487179487,\n \"acc_stderr\": 
0.021992016662370564,\n \"acc_norm\": 0.7487179487179487,\n \"acc_norm_stderr\": 0.021992016662370564\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7983193277310925,\n \"acc_stderr\": 0.02606431340630453,\n \"acc_norm\": 0.7983193277310925,\n \"acc_norm_stderr\": 0.02606431340630453\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588963,\n \"acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588963\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8823529411764706,\n \"acc_stderr\": 0.022613286601132012,\n \"acc_norm\": 0.8823529411764706,\n \"acc_norm_stderr\": 0.022613286601132012\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878474,\n \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878474\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.757847533632287,\n \"acc_stderr\": 0.028751392398694755,\n \"acc_norm\": 0.757847533632287,\n \"acc_norm_stderr\": 0.028751392398694755\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.04697113923010213,\n \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.04697113923010213\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9145299145299145,\n \"acc_stderr\": 0.018315891685625838,\n \"acc_norm\": 0.9145299145299145,\n \"acc_norm_stderr\": 0.018315891685625838\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8748403575989783,\n \"acc_stderr\": 0.01183295423930574,\n \"acc_norm\": 0.8748403575989783,\n 
\"acc_norm_stderr\": 0.01183295423930574\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7687861271676301,\n \"acc_stderr\": 0.02269865716785571,\n \"acc_norm\": 0.7687861271676301,\n \"acc_norm_stderr\": 0.02269865716785571\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n \"acc_stderr\": 0.016295332328155814,\n \"acc_norm\": 0.3877094972067039,\n \"acc_norm_stderr\": 0.016295332328155814\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.0242886194660461,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.0242886194660461\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7684887459807074,\n \"acc_stderr\": 0.023956532766639133,\n \"acc_norm\": 0.7684887459807074,\n \"acc_norm_stderr\": 0.023956532766639133\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7716049382716049,\n \"acc_stderr\": 0.023358211840626267,\n \"acc_norm\": 0.7716049382716049,\n \"acc_norm_stderr\": 0.023358211840626267\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.5602836879432624,\n \"acc_stderr\": 0.029609912075594106,\n \"acc_norm\": 0.5602836879432624,\n \"acc_norm_stderr\": 0.029609912075594106\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5234680573663625,\n \"acc_stderr\": 0.012756161942523355,\n \"acc_norm\": 0.5234680573663625,\n \"acc_norm_stderr\": 0.012756161942523355\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399677,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399677\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n \"acc_stderr\": 0.022509345325101696,\n \"acc_norm\": 0.8855721393034826,\n \"acc_norm_stderr\": 0.022509345325101696\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.92,\n \"acc_stderr\": 0.0272659924344291,\n \"acc_norm\": 0.92,\n \"acc_norm_stderr\": 0.0272659924344291\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n \"mc1_stderr\": 0.01699762787190793,\n \"mc2\": 0.5325324692481855,\n \"mc2_stderr\": 0.015130320422933614\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569572\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5269143290371494,\n \"acc_stderr\": 0.013752517189717468\n }\n}\n```", "repo_url": "https://huggingface.co/AA051610/A13", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-08-54.129715.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["**/details_harness|winogrande|5_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-08-54.129715.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_08_54.129715", "path": ["results_2023-12-13T14-08-54.129715.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-08-54.129715.parquet"]}]}]} | 2023-12-13T14:12:28+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of AA051610/A13
Dataset automatically created during the evaluation run of model AA051610/A13 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
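A minimal sketch, assuming the details repository follows the leaderboard's usual `open-llm-leaderboard/details_<org>__<model>` naming (the exact repository id is not spelled out in this card):

```python
from datasets import load_dataset

# Assumed repository id, following the Open LLM Leaderboard details naming convention;
# "harness_winogrande_5" is one of the 63 configurations listed in this card's metadata.
data = load_dataset("open-llm-leaderboard/details_AA051610__A13",
	"harness_winogrande_5",
	split="train")
```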
## Latest results
These are the latest results from run 2023-12-13T14:08:54.129715 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of AA051610/A13\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:08:54.129715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of AA051610/A13\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:08:54.129715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
175,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of AA051610/A13\n\n\n\nDataset automatically created during the evaluation run of model AA051610/A13 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:08:54.129715(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
7591ac04691ace28b48f6a53d6e1de53d1dd39db | # Dataset Card for "norwegian-nli-triplets"
A reformatting of compatible triplets from https://huggingface.co/datasets/tollefj/all-nli-NOB.
This includes all pairs that contain both a contradiction and an entailment.
In cases where a neutral hypothesis also exists, the triplet is duplicated with the same contradiction/entailment pair.
Simple normalization of sentences:
```python
from neattext.functions import clean_text
import re


def symbol_cleaner(s):
    # Strip leading and trailing non-alphanumeric symbols.
    s = re.sub(r"^[^\w\d]+", "", s)
    s = re.sub(r"[^\w\d]+$", "", s)
    return s


def filter_sent(sent):
    # clean_text keeps punctuation, stopwords and non-ASCII characters (other neattext defaults apply).
    sent = clean_text(sent, puncts=False, stopwords=False, non_ascii=False)
    sent = symbol_cleaner(sent)
    return sent


def filter_triplet(triplet):
    # Normalize all three sentences of a triplet.
    return [filter_sent(sent) for sent in triplet]
```
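As a usage sketch (assuming the helpers above and the `datasets` library), the triplets can be loaded and run through the same normalization:

```python
from datasets import load_dataset

# Columns: anchor, entailment, contradiction (see the dataset_info below).
ds = load_dataset("tollefj/norwegian-nli-triplets", split="train")

# Apply the normalization defined above to every column of every row.
ds = ds.map(lambda row: {k: filter_sent(v) for k, v in row.items()})
```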
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | tollefj/norwegian-nli-triplets | [
"region:us"
] | 2023-12-13T14:16:48+00:00 | {"dataset_info": {"features": [{"name": "anchor", "dtype": "string"}, {"name": "entailment", "dtype": "string"}, {"name": "contradiction", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 88455406, "num_examples": 551015}], "download_size": 39831572, "dataset_size": 88455406}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T14:19:15+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "norwegian-nli-triplets"
A reformatting of compatible triplets from URL
This includes all pairs that contain both a contradiction and an entailment.
In cases where a neutral also exists, this is a duplicated triplet, with the same contr./ent.
Simple normalization of sentences:
More Information needed | [
"# Dataset Card for \"norwegian-nli-triplets\"\n\nA reformatting of compatible triplets from URL\nThis includes all pairs that contain both a contradiction and an entailment.\nIn cases where a neutral also exists, this is a duplicated triplet, with the same contr./ent.\n\nSimple normalization of sentences:\n\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"norwegian-nli-triplets\"\n\nA reformatting of compatible triplets from URL\nThis includes all pairs that contain both a contradiction and an entailment.\nIn cases where a neutral also exists, this is a duplicated triplet, with the same contr./ent.\n\nSimple normalization of sentences:\n\n\nMore Information needed"
] | [
6,
80
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"norwegian-nli-triplets\"\n\nA reformatting of compatible triplets from URL\nThis includes all pairs that contain both a contradiction and an entailment.\nIn cases where a neutral also exists, this is a duplicated triplet, with the same contr./ent.\n\nSimple normalization of sentences:\n\n\nMore Information needed"
] |
bc02cafedc9145d994de1b98e5706dc0b817260a |
# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [v1olet/v1olet_merged_dpo_7B_v3](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3",
"harness_winogrande_5",
split="train")
```
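The aggregated numbers can be pulled the same way through the "results" configuration; a sketch using the config and split names listed in this card's metadata:

```python
from datasets import load_dataset

# "results" stores the aggregated metrics; the "latest" split points at the most recent run.
results = load_dataset("open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3",
	"results",
	split="latest")
```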
## Latest results
These are the [latest results from run 2023-12-13T14:24:48.868397](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3/blob/main/results_2023-12-13T14-24-48.868397.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6411740347675706,
"acc_stderr": 0.03228342039008203,
"acc_norm": 0.6407691161331389,
"acc_norm_stderr": 0.03295002376578124,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6907171691355769,
"mc2_stderr": 0.015243695704371275
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725223,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989503
},
"harness|hellaswag|10": {
"acc": 0.7143995220075682,
"acc_stderr": 0.004507768029590101,
"acc_norm": 0.8770165305715992,
"acc_norm_stderr": 0.0032774703870227257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554956,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524006,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524006
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6907171691355769,
"mc2_stderr": 0.015243695704371275
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918742
},
"harness|gsm8k|5": {
"acc": 0.66868840030326,
"acc_stderr": 0.01296499967968867
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
This dataset is intended for inspecting the evaluation details of the Open LLM Leaderboard run for `v1olet/v1olet_merged_dpo_7B_v3`: the per-sample predictions for each evaluated task, and the aggregated metrics stored in the `results` configuration.
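A minimal sketch of such a load, assuming the `datasets` library is installed; the configuration name `harness_winogrande_5` and the `latest` split are taken from the configuration list in this card's metadata:

```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3"

# Per-sample details for one evaluated task; every task configuration in this
# card exposes one split per run timestamp plus a "latest" split that points
# to the most recent run.
winogrande_details = load_dataset(REPO, "harness_winogrande_5", split="latest")
print(winogrande_details)

# Aggregated metrics for the whole run are stored in the "results"
# configuration (assumed here to follow the same split naming).
aggregated = load_dataset(REPO, "results", split="latest")
print(aggregated)
```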
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
The dataset is composed of 63 configurations, one per evaluated task, plus an additional `results` configuration that stores the aggregated metrics of each run. Each configuration contains one split per evaluation run (named with the run timestamp) and a `latest` split that always points to the most recent run.
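A small sketch for enumerating the available configurations and splits with the `datasets` utilities (function names as in the `datasets` library; the exact output depends on the library version):

```python
from datasets import get_dataset_config_names, get_dataset_split_names

REPO = "open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3"

# List the per-task configurations (e.g. "harness_arc_challenge_25",
# "harness_gsm8k_5", "harness_hendrycksTest_*", "harness_winogrande_5", ...).
configs = get_dataset_config_names(REPO)
print(len(configs), "configurations")

# For a given configuration, list its splits: timestamped runs plus "latest".
print(get_dataset_split_names(REPO, "harness_winogrande_5"))
```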
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3 | [
"region:us"
] | 2023-12-13T14:19:38+00:00 | {"pretty_name": "Evaluation run of v1olet/v1olet_merged_dpo_7B_v3", "dataset_summary": "Dataset automatically created during the evaluation run of model [v1olet/v1olet_merged_dpo_7B_v3](https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:24:48.868397](https://huggingface.co/datasets/open-llm-leaderboard/details_v1olet__v1olet_merged_dpo_7B_v3/blob/main/results_2023-12-13T14-24-48.868397.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6411740347675706,\n \"acc_stderr\": 0.03228342039008203,\n \"acc_norm\": 0.6407691161331389,\n \"acc_norm_stderr\": 0.03295002376578124,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6907171691355769,\n \"mc2_stderr\": 0.015243695704371275\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989503\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7143995220075682,\n \"acc_stderr\": 0.004507768029590101,\n \"acc_norm\": 0.8770165305715992,\n \"acc_norm_stderr\": 0.0032774703870227257\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554956,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n 
\"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n \"acc_stderr\": 0.016653875777524006,\n \"acc_norm\": 0.4547486033519553,\n \"acc_norm_stderr\": 0.016653875777524006\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6907171691355769,\n \"mc2_stderr\": 0.015243695704371275\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918742\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66868840030326,\n \"acc_stderr\": 0.01296499967968867\n }\n}\n```", "repo_url": "https://huggingface.co/v1olet/v1olet_merged_dpo_7B_v3", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-16-48.443238.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-16-48.443238.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-16-48.443238.parquet"]}, 
{"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["**/details_harness|winogrande|5_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": ["**/details_harness|winogrande|5_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-24-48.868397.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_16_48.443238", "path": ["results_2023-12-13T14-16-48.443238.parquet"]}, {"split": "2023_12_13T14_24_48.868397", "path": 
["results_2023-12-13T14-24-48.868397.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T14-24-48.868397.parquet"]}]}]} | 2023-12-13T14:27:45+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v3
Dataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v3 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T14:24:48.868397 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v3\n\n\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:24:48.868397(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v3\n\n\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:24:48.868397(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
203,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of v1olet/v1olet_merged_dpo_7B_v3\n\n\n\nDataset automatically created during the evaluation run of model v1olet/v1olet_merged_dpo_7B_v3 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:24:48.868397(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
fac5849c561e9bf774c09db39f73d44e5f607b56 |
# Dataset Card for Evaluation run of viethq188/Rabbit-7B-v2-DPO-Chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [viethq188/Rabbit-7B-v2-DPO-Chat](https://huggingface.co/viethq188/Rabbit-7B-v2-DPO-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat",
"harness_winogrande_5",
split="train")
```
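
The same pattern works for the other configurations of this dataset. As a minimal sketch (assuming the "results" configuration and its "latest" split follow the conventions described above), you can load the aggregated results directly:

```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split points to the most
# recent evaluation run stored in this repository.
results = load_dataset(
    "open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat",
    "results",
    split="latest",
)
print(results)
```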
## Latest results
These are the [latest results from run 2023-12-13T14:18:20.243457](https://huggingface.co/datasets/open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat/blob/main/results_2023-12-13T14-18-20.243457.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6313831752942103,
"acc_stderr": 0.032826739625957765,
"acc_norm": 0.6334994737335702,
"acc_norm_stderr": 0.033490035767409,
"mc1": 0.49938800489596086,
"mc1_stderr": 0.017503487938892507,
"mc2": 0.6706223675757005,
"mc2_stderr": 0.01506098649001056
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893452,
"acc_norm": 0.6612627986348123,
"acc_norm_stderr": 0.01383056892797433
},
"harness|hellaswag|10": {
"acc": 0.6766580362477594,
"acc_stderr": 0.004667960519938637,
"acc_norm": 0.851822346146186,
"acc_norm_stderr": 0.0035454991695580526
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438662,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438662
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03745554791462456,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03745554791462456
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396262,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396262
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7580645161290323,
"acc_stderr": 0.024362599693031093,
"acc_norm": 0.7580645161290323,
"acc_norm_stderr": 0.024362599693031093
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5369458128078818,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.5369458128078818,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723872,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723872
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.02446861524147892,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.02446861524147892
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.029443169323031537,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.029443169323031537
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291932,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291932
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266854,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266854
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5509259259259259,
"acc_stderr": 0.03392238405321617,
"acc_norm": 0.5509259259259259,
"acc_norm_stderr": 0.03392238405321617
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676177,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676177
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650741,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650741
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8020434227330779,
"acc_stderr": 0.014248873549217564,
"acc_norm": 0.8020434227330779,
"acc_norm_stderr": 0.014248873549217564
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.535195530726257,
"acc_stderr": 0.016681020931076637,
"acc_norm": 0.535195530726257,
"acc_norm_stderr": 0.016681020931076637
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464482,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464482
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532072,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532072
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8713450292397661,
"acc_stderr": 0.025679342723276908,
"acc_norm": 0.8713450292397661,
"acc_norm_stderr": 0.025679342723276908
},
"harness|truthfulqa:mc|0": {
"mc1": 0.49938800489596086,
"mc1_stderr": 0.017503487938892507,
"mc2": 0.6706223675757005,
"mc2_stderr": 0.01506098649001056
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386776
},
"harness|gsm8k|5": {
"acc": 0.5564821834723275,
"acc_stderr": 0.013684327592606165
}
}
```
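
If you prefer to work with these numbers programmatically instead of copying them from the JSON above, one option is to download the raw results file linked above and parse it. A minimal sketch follows (the exact top-level layout of the file can differ between harness versions, so the nesting is handled defensively):

```python
import json

from huggingface_hub import hf_hub_download

# Fetch the results file for the run shown above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat",
    filename="results_2023-12-13T14-18-20.243457.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The per-task metrics may sit at the top level or under a "results" key.
metrics = data.get("results", data)
print(metrics["harness|winogrande|5"])
```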
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat | [
"region:us"
] | 2023-12-13T14:21:11+00:00 | {"pretty_name": "Evaluation run of viethq188/Rabbit-7B-v2-DPO-Chat", "dataset_summary": "Dataset automatically created during the evaluation run of model [viethq188/Rabbit-7B-v2-DPO-Chat](https://huggingface.co/viethq188/Rabbit-7B-v2-DPO-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:18:20.243457](https://huggingface.co/datasets/open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat/blob/main/results_2023-12-13T14-18-20.243457.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6313831752942103,\n \"acc_stderr\": 0.032826739625957765,\n \"acc_norm\": 0.6334994737335702,\n \"acc_norm_stderr\": 0.033490035767409,\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.017503487938892507,\n \"mc2\": 0.6706223675757005,\n \"mc2_stderr\": 0.01506098649001056\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893452,\n \"acc_norm\": 0.6612627986348123,\n \"acc_norm_stderr\": 0.01383056892797433\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6766580362477594,\n \"acc_stderr\": 0.004667960519938637,\n \"acc_norm\": 0.851822346146186,\n \"acc_norm_stderr\": 0.0035454991695580526\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438662,\n \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438662\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.03745554791462456,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03745554791462456\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396262,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396262\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7580645161290323,\n \"acc_stderr\": 0.024362599693031093,\n \"acc_norm\": 0.7580645161290323,\n \"acc_norm_stderr\": 0.024362599693031093\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5369458128078818,\n \"acc_stderr\": 0.035083705204426656,\n \"acc_norm\": 0.5369458128078818,\n \"acc_norm_stderr\": 0.035083705204426656\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n 
\"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.02446861524147892,\n \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.02446861524147892\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37037037037037035,\n \"acc_stderr\": 0.029443169323031537,\n \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.029443169323031537\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291932,\n \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291932\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.818348623853211,\n \"acc_stderr\": 0.016530617409266854,\n \"acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266854\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5509259259259259,\n \"acc_stderr\": 0.03392238405321617,\n \"acc_norm\": 0.5509259259259259,\n \"acc_norm_stderr\": 0.03392238405321617\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676177,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676177\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.04330043749650741,\n \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.04330043749650741\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8020434227330779,\n \"acc_stderr\": 0.014248873549217564,\n 
\"acc_norm\": 0.8020434227330779,\n \"acc_norm_stderr\": 0.014248873549217564\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.535195530726257,\n \"acc_stderr\": 0.016681020931076637,\n \"acc_norm\": 0.535195530726257,\n \"acc_norm_stderr\": 0.016681020931076637\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464482,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464482\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532072,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532072\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8713450292397661,\n \"acc_stderr\": 0.025679342723276908,\n \"acc_norm\": 0.8713450292397661,\n \"acc_norm_stderr\": 0.025679342723276908\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.49938800489596086,\n \"mc1_stderr\": 0.017503487938892507,\n \"mc2\": 0.6706223675757005,\n \"mc2_stderr\": 0.01506098649001056\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386776\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5564821834723275,\n \"acc_stderr\": 0.013684327592606165\n }\n}\n```", "repo_url": 
"https://huggingface.co/viethq188/Rabbit-7B-v2-DPO-Chat", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-18-20.243457.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-18-20.243457.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-18-20.243457.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-18-20.243457.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-18-20.243457.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-18-20.243457.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["**/details_harness|winogrande|5_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-18-20.243457.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_18_20.243457", "path": ["results_2023-12-13T14-18-20.243457.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-18-20.243457.parquet"]}]}]} | 2023-12-13T14:21:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of viethq188/Rabbit-7B-v2-DPO-Chat
Dataset automatically created during the evaluation run of model viethq188/Rabbit-7B-v2-DPO-Chat on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
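A minimal sketch is shown below. Note that the repository id is inferred from the leaderboard's usual `details_{org}__{model}` naming convention rather than stated in this card, so treat it as an assumption; the configuration name comes from this card's metadata.

```python
from datasets import load_dataset

# Assumed repository id, following the open-llm-leaderboard naming convention.
# "harness_winogrande_5" is one of the configurations listed in this card's metadata;
# the "train" split points to the latest results.
data = load_dataset(
    "open-llm-leaderboard/details_viethq188__Rabbit-7B-v2-DPO-Chat",
    "harness_winogrande_5",
    split="train",
)
```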
## Latest results
These are the latest results from run 2023-12-13T14:18:20.243457 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of viethq188/Rabbit-7B-v2-DPO-Chat\n\n\n\nDataset automatically created during the evaluation run of model viethq188/Rabbit-7B-v2-DPO-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:18:20.243457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of viethq188/Rabbit-7B-v2-DPO-Chat\n\n\n\nDataset automatically created during the evaluation run of model viethq188/Rabbit-7B-v2-DPO-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:18:20.243457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of viethq188/Rabbit-7B-v2-DPO-Chat\n\n\n\nDataset automatically created during the evaluation run of model viethq188/Rabbit-7B-v2-DPO-Chat on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:18:20.243457(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
eee090aa89e4ca2b6508b0e05c38a56ea1cf3931 |
# Dataset Card for Evaluation run of one-man-army/una-neural-chat-v3-3-P2-OMA
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [one-man-army/una-neural-chat-v3-3-P2-OMA](https://huggingface.co/one-man-army/una-neural-chat-v3-3-P2-OMA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_one-man-army__una-neural-chat-v3-3-P2-OMA",
"harness_winogrande_5",
split="train")
```
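The aggregated metrics mentioned above live in the separate "results" configuration. A minimal sketch of loading it, assuming the "latest" split defined in this repository's configs:

```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# the "latest" split points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_one-man-army__una-neural-chat-v3-3-P2-OMA",
    "results",
    split="latest",
)
```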
## Latest results
These are the [latest results from run 2023-12-13T14:25:29.170115](https://huggingface.co/datasets/open-llm-leaderboard/details_one-man-army__una-neural-chat-v3-3-P2-OMA/blob/main/results_2023-12-13T14-25-29.170115.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6348474564429509,
"acc_stderr": 0.032547874322213524,
"acc_norm": 0.6360060797642175,
"acc_norm_stderr": 0.03320713700726099,
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6548761858136044,
"mc2_stderr": 0.01508528563797577
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175456,
"acc_norm": 0.6732081911262798,
"acc_norm_stderr": 0.013706665975587331
},
"harness|hellaswag|10": {
"acc": 0.6783509261103365,
"acc_stderr": 0.004661544991583035,
"acc_norm": 0.8632742481577375,
"acc_norm_stderr": 0.00342855459595022
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601688,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601688
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895514,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121427,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121427
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.02439667298509477,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.02439667298509477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.029116617606083015,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.029116617606083015
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.015990154885073393,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.015990154885073393
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728743,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728743
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946315,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946315
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.03351953879521271,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.03351953879521271
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608311,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608311
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429125,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429125
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.026385273703464496,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.026385273703464496
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291456,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291456
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.028064998167040094,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.028064998167040094
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4883720930232558,
"mc1_stderr": 0.017498767175740088,
"mc2": 0.6548761858136044,
"mc2_stderr": 0.01508528563797577
},
"harness|winogrande|5": {
"acc": 0.797947908445146,
"acc_stderr": 0.011285013754047444
},
"harness|gsm8k|5": {
"acc": 0.6224412433661866,
"acc_stderr": 0.013353150666358546
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_one-man-army__una-neural-chat-v3-3-P2-OMA | [
"region:us"
] | 2023-12-13T14:25:36+00:00 | {"pretty_name": "Evaluation run of one-man-army/una-neural-chat-v3-3-P2-OMA", "dataset_summary": "Dataset automatically created during the evaluation run of model [one-man-army/una-neural-chat-v3-3-P2-OMA](https://huggingface.co/one-man-army/una-neural-chat-v3-3-P2-OMA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_one-man-army__una-neural-chat-v3-3-P2-OMA\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:25:29.170115](https://huggingface.co/datasets/open-llm-leaderboard/details_one-man-army__una-neural-chat-v3-3-P2-OMA/blob/main/results_2023-12-13T14-25-29.170115.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6348474564429509,\n \"acc_stderr\": 0.032547874322213524,\n \"acc_norm\": 0.6360060797642175,\n \"acc_norm_stderr\": 0.03320713700726099,\n \"mc1\": 0.4883720930232558,\n \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6548761858136044,\n \"mc2_stderr\": 0.01508528563797577\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175456,\n \"acc_norm\": 0.6732081911262798,\n \"acc_norm_stderr\": 0.013706665975587331\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6783509261103365,\n \"acc_stderr\": 0.004661544991583035,\n \"acc_norm\": 0.8632742481577375,\n \"acc_norm_stderr\": 0.00342855459595022\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n \"acc_norm_stderr\": 0.03827052357950756\n },\n 
\"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601688,\n \"acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601688\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n \"acc_stderr\": 0.024472243840895514,\n \"acc_norm\": 0.7548387096774194,\n \"acc_norm_stderr\": 0.024472243840895514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121427,\n \"acc_norm\": 0.8808290155440415,\n 
\"acc_norm_stderr\": 0.023381935348121427\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.02439667298509477,\n \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.02439667298509477\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.35185185185185186,\n \"acc_stderr\": 0.029116617606083015,\n \"acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.029116617606083015\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8330275229357799,\n \"acc_stderr\": 0.015990154885073393,\n \"acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.015990154885073393\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728743,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728743\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.042365112580946315,\n \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.042365112580946315\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.03351953879521271,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.03351953879521271\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n 
\"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n \"acc_stderr\": 0.013428186370608311,\n \"acc_norm\": 0.8301404853128991,\n \"acc_norm_stderr\": 0.013428186370608311\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n \"acc_stderr\": 0.016260159604429125,\n \"acc_norm\": 0.38324022346368714,\n \"acc_norm_stderr\": 0.016260159604429125\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n \"acc_stderr\": 0.026385273703464496,\n \"acc_norm\": 0.684887459807074,\n \"acc_norm_stderr\": 0.026385273703464496\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495026,\n \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495026\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291456,\n \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291456\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.028064998167040094,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.028064998167040094\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4883720930232558,\n \"mc1_stderr\": 0.017498767175740088,\n \"mc2\": 0.6548761858136044,\n \"mc2_stderr\": 0.01508528563797577\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.797947908445146,\n \"acc_stderr\": 0.011285013754047444\n },\n \"harness|gsm8k|5\": {\n 
\"acc\": 0.6224412433661866,\n \"acc_stderr\": 0.013353150666358546\n }\n}\n```", "repo_url": "https://huggingface.co/one-man-army/una-neural-chat-v3-3-P2-OMA", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-22-46.443158.parquet", 
"**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-22-46.443158.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-25-29.170115.parquet", 
"**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-25-29.170115.parquet", 
"**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-25-29.170115.parquet", 
"**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-25-29.170115.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-25-29.170115.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": 
"2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": 
"2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-22-46.443158.parquet"]}, 
{"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["**/details_harness|winogrande|5_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": ["**/details_harness|winogrande|5_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-25-29.170115.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_22_46.443158", "path": ["results_2023-12-13T14-22-46.443158.parquet"]}, {"split": "2023_12_13T14_25_29.170115", "path": 
["results_2023-12-13T14-25-29.170115.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T14-25-29.170115.parquet"]}]}]} | 2023-12-13T14:29:04+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of one-man-army/una-neural-chat-v3-3-P2-OMA
Dataset automatically created during the evaluation run of model one-man-army/una-neural-chat-v3-3-P2-OMA on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
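A minimal sketch of that call is shown below; the dataset id is assumed to follow the leaderboard's usual details_<org>__<model> naming for this repo, and "harness_winogrande_5" is one of the configurations listed in the files above:

```python
from datasets import load_dataset

# Assumed dataset id, following the leaderboard's details_<org>__<model> naming pattern
data = load_dataset("open-llm-leaderboard/details_one-man-army__una-neural-chat-v3-3-P2-OMA",
	"harness_winogrande_5",
	split="train")
```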
## Latest results
These are the latest results from run 2023-12-13T14:25:29.170115 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of one-man-army/una-neural-chat-v3-3-P2-OMA\n\n\n\nDataset automatically created during the evaluation run of model one-man-army/una-neural-chat-v3-3-P2-OMA on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:25:29.170115(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of one-man-army/una-neural-chat-v3-3-P2-OMA\n\n\n\nDataset automatically created during the evaluation run of model one-man-army/una-neural-chat-v3-3-P2-OMA on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:25:29.170115(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
201,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of one-man-army/una-neural-chat-v3-3-P2-OMA\n\n\n\nDataset automatically created during the evaluation run of model one-man-army/una-neural-chat-v3-3-P2-OMA on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:25:29.170115(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
80c816d72b3d3b5e49e1de2a184788fad9de5e12 |
# Dataset Card for Evaluation run of FPHam/Writing_Partner_Mistral_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FPHam/Writing_Partner_Mistral_7B](https://huggingface.co/FPHam/Writing_Partner_Mistral_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FPHam__Writing_Partner_Mistral_7B",
"harness_winogrande_5",
split="train")
```
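The aggregated metrics mentioned above live in a separate configuration; a minimal sketch of loading it is below. The "results" config name and the "latest" split follow the convention of these leaderboard detail repos and are assumptions here, not confirmed for this specific repository:

```python
from datasets import load_dataset

# "results" is assumed to hold the aggregated metrics; "latest" points to the most recent run
results = load_dataset("open-llm-leaderboard/details_FPHam__Writing_Partner_Mistral_7B",
	"results",
	split="latest")
```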
## Latest results
These are the [latest results from run 2023-12-13T14:26:14.997635](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Writing_Partner_Mistral_7B/blob/main/results_2023-12-13T14-26-14.997635.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6267475040220933,
"acc_stderr": 0.03244242841582535,
"acc_norm": 0.629552373234452,
"acc_norm_stderr": 0.03308469739933321,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403382,
"mc2": 0.48545160584505326,
"mc2_stderr": 0.015262178476338869
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.01426963463567073,
"acc_norm": 0.6459044368600683,
"acc_norm_stderr": 0.013975454122756557
},
"harness|hellaswag|10": {
"acc": 0.6577375024895439,
"acc_stderr": 0.0047349726682996175,
"acc_norm": 0.845947022505477,
"acc_norm_stderr": 0.003602617446641395
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.562962962962963,
"acc_stderr": 0.04284958639753401,
"acc_norm": 0.562962962962963,
"acc_norm_stderr": 0.04284958639753401
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6381578947368421,
"acc_stderr": 0.039105257528497236,
"acc_norm": 0.6381578947368421,
"acc_norm_stderr": 0.039105257528497236
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.02552503438247489,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.02552503438247489
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.024359581465397,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.024359581465397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.01659525971039932,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.01659525971039932
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809784,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809784
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001512,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001512
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574903,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574903
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6881028938906752,
"acc_stderr": 0.02631185807185416,
"acc_norm": 0.6881028938906752,
"acc_norm_stderr": 0.02631185807185416
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730583,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730583
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170595,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170595
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6584967320261438,
"acc_stderr": 0.019184639328092487,
"acc_norm": 0.6584967320261438,
"acc_norm_stderr": 0.019184639328092487
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197771,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197771
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.016289203374403382,
"mc2": 0.48545160584505326,
"mc2_stderr": 0.015262178476338869
},
"harness|winogrande|5": {
"acc": 0.7687450670876085,
"acc_stderr": 0.01185004012485051
},
"harness|gsm8k|5": {
"acc": 0.5458680818802123,
"acc_stderr": 0.013714410945264564
}
}
```
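
If you prefer to work with the raw results file linked above rather than the `datasets` configurations, a minimal sketch (assuming `huggingface_hub` is installed, and that the aggregated block sits under a top-level "results" key as in the usual harness output) is:

```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for this run from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_FPHam__Writing_Partner_Mistral_7B",
    filename="results_2023-12-13T14-26-14.997635.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# Fall back to the top level if the file stores the metrics directly.
metrics = run.get("results", run)
print(metrics["all"])
```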
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_FPHam__Writing_Partner_Mistral_7B | [
"region:us"
] | 2023-12-13T14:29:05+00:00 | {"pretty_name": "Evaluation run of FPHam/Writing_Partner_Mistral_7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [FPHam/Writing_Partner_Mistral_7B](https://huggingface.co/FPHam/Writing_Partner_Mistral_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FPHam__Writing_Partner_Mistral_7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:26:14.997635](https://huggingface.co/datasets/open-llm-leaderboard/details_FPHam__Writing_Partner_Mistral_7B/blob/main/results_2023-12-13T14-26-14.997635.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6267475040220933,\n \"acc_stderr\": 0.03244242841582535,\n \"acc_norm\": 0.629552373234452,\n \"acc_norm_stderr\": 0.03308469739933321,\n \"mc1\": 0.31701346389228885,\n \"mc1_stderr\": 0.016289203374403382,\n \"mc2\": 0.48545160584505326,\n \"mc2_stderr\": 0.015262178476338869\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.01426963463567073,\n \"acc_norm\": 0.6459044368600683,\n \"acc_norm_stderr\": 0.013975454122756557\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6577375024895439,\n \"acc_stderr\": 0.0047349726682996175,\n \"acc_norm\": 0.845947022505477,\n \"acc_norm_stderr\": 0.003602617446641395\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.562962962962963,\n \"acc_stderr\": 0.04284958639753401,\n \"acc_norm\": 0.562962962962963,\n \"acc_norm_stderr\": 0.04284958639753401\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6381578947368421,\n \"acc_stderr\": 0.039105257528497236,\n \"acc_norm\": 0.6381578947368421,\n \"acc_norm_stderr\": 0.039105257528497236\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.42,\n 
\"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.02552503438247489,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.02552503438247489\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.024359581465397,\n \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.024359581465397\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039932,\n \"acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039932\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.039578354719809784,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.039578354719809784\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n \"acc_stderr\": 
0.013927751372001512,\n \"acc_norm\": 0.8135376756066411,\n \"acc_norm_stderr\": 0.013927751372001512\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574903,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574903\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875195,\n \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875195\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6881028938906752,\n \"acc_stderr\": 0.02631185807185416,\n \"acc_norm\": 0.6881028938906752,\n \"acc_norm_stderr\": 0.02631185807185416\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730583,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730583\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170595,\n \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170595\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6584967320261438,\n \"acc_stderr\": 0.019184639328092487,\n \"acc_norm\": 0.6584967320261438,\n \"acc_norm_stderr\": 0.019184639328092487\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197771,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197771\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n \"mc1_stderr\": 0.016289203374403382,\n \"mc2\": 0.48545160584505326,\n \"mc2_stderr\": 0.015262178476338869\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7687450670876085,\n \"acc_stderr\": 0.01185004012485051\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5458680818802123,\n \"acc_stderr\": 0.013714410945264564\n }\n}\n```", "repo_url": 
"https://huggingface.co/FPHam/Writing_Partner_Mistral_7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-26-14.997635.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-26-14.997635.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-26-14.997635.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-26-14.997635.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-26-14.997635.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-26-14.997635.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["**/details_harness|winogrande|5_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-26-14.997635.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_26_14.997635", "path": ["results_2023-12-13T14-26-14.997635.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-26-14.997635.parquet"]}]}]} | 2023-12-13T14:29:49+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of FPHam/Writing_Partner_Mistral_7B
Dataset automatically created during the evaluation run of model FPHam/Writing_Partner_Mistral_7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T14:26:14.997635 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of FPHam/Writing_Partner_Mistral_7B\n\n\n\nDataset automatically created during the evaluation run of model FPHam/Writing_Partner_Mistral_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:26:14.997635(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of FPHam/Writing_Partner_Mistral_7B\n\n\n\nDataset automatically created during the evaluation run of model FPHam/Writing_Partner_Mistral_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:26:14.997635(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of FPHam/Writing_Partner_Mistral_7B\n\n\n\nDataset automatically created during the evaluation run of model FPHam/Writing_Partner_Mistral_7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:26:14.997635(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
65e24406e8646e79db100d1438bf9e5bf38b44fe |
# Dataset Card for Evaluation run of Toten5/Marcoroni-neural-chat-7B-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Toten5/Marcoroni-neural-chat-7B-v2](https://huggingface.co/Toten5/Marcoroni-neural-chat-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2",
"harness_winogrande_5",
split="train")
```
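Once loaded, the split behaves like any other 🤗 Datasets split. Below is a minimal sketch of how it could be inspected; the exact column names depend on the task and are not documented in this card:

```python
from datasets import load_dataset

# Re-load the same split as above and take a quick look at it.
# Column names and contents are task-dependent, so this is only an
# illustrative sketch rather than a description of the actual schema.
data = load_dataset(
    "open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2",
    "harness_winogrande_5",
    split="train",
)

print(data)      # number of rows and column names of the split
print(data[0])   # first evaluated example of the run
```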
## Latest results
These are the [latest results from run 2023-12-13T14:32:57.370313](https://huggingface.co/datasets/open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2/blob/main/results_2023-12-13T14-32-57.370313.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6513407168373279,
"acc_stderr": 0.032196949241216306,
"acc_norm": 0.6508688978862172,
"acc_norm_stderr": 0.032867694893298695,
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6183630948757572,
"mc2_stderr": 0.015107702805519594
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.013796182947785562,
"acc_norm": 0.6860068259385665,
"acc_norm_stderr": 0.013562691224726304
},
"harness|hellaswag|10": {
"acc": 0.6792471619199363,
"acc_stderr": 0.004658120152230805,
"acc_norm": 0.8632742481577375,
"acc_norm_stderr": 0.00342855459595022
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.025487187147859375,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.025487187147859375
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.0442626668137991,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.0442626668137991
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887037,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887037
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8532110091743119,
"acc_stderr": 0.01517314184512624,
"acc_norm": 0.8532110091743119,
"acc_norm_stderr": 0.01517314184512624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640766,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640766
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.02126271940040697,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.02126271940040697
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834832,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631913,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631913
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4530638852672751,
"acc_stderr": 0.012713845972358981,
"acc_norm": 0.4530638852672751,
"acc_norm_stderr": 0.012713845972358981
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.02619392354445412,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.02619392354445412
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4724602203182375,
"mc1_stderr": 0.017476930190712187,
"mc2": 0.6183630948757572,
"mc2_stderr": 0.015107702805519594
},
"harness|winogrande|5": {
"acc": 0.8042620363062352,
"acc_stderr": 0.011151145042218327
},
"harness|gsm8k|5": {
"acc": 0.731614859742229,
"acc_stderr": 0.012205702688013671
}
}
```
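The aggregated numbers above are also stored in the "results" configuration of this repository. The sketch below shows how they could be retrieved programmatically, assuming the "results" configuration exposes a "latest" split in the same way as the per-task configurations:

```python
from datasets import load_dataset

# Sketch: load the aggregated results of the latest run.
# The "latest" split name is an assumption mirroring the per-task configurations.
results = load_dataset(
    "open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2",
    "results",
    split="latest",
)

print(results[0])  # one row containing the aggregated metrics shown above
```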
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2 | [
"region:us"
] | 2023-12-13T14:35:50+00:00 | {"pretty_name": "Evaluation run of Toten5/Marcoroni-neural-chat-7B-v2", "dataset_summary": "Dataset automatically created during the evaluation run of model [Toten5/Marcoroni-neural-chat-7B-v2](https://huggingface.co/Toten5/Marcoroni-neural-chat-7B-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:32:57.370313](https://huggingface.co/datasets/open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2/blob/main/results_2023-12-13T14-32-57.370313.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6513407168373279,\n \"acc_stderr\": 0.032196949241216306,\n \"acc_norm\": 0.6508688978862172,\n \"acc_norm_stderr\": 0.032867694893298695,\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6183630948757572,\n \"mc2_stderr\": 0.015107702805519594\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.013796182947785562,\n \"acc_norm\": 0.6860068259385665,\n \"acc_norm_stderr\": 0.013562691224726304\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6792471619199363,\n \"acc_stderr\": 0.004658120152230805,\n \"acc_norm\": 0.8632742481577375,\n \"acc_norm_stderr\": 0.00342855459595022\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.025487187147859375,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.025487187147859375\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.0442626668137991,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.0442626668137991\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887037,\n \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887037\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8532110091743119,\n \"acc_stderr\": 0.01517314184512624,\n \"acc_norm\": 0.8532110091743119,\n \"acc_norm_stderr\": 0.01517314184512624\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640766,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640766\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.02126271940040697,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.02126271940040697\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n \"acc_stderr\": 0.013664230995834832,\n 
\"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834832\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n \"acc_stderr\": 0.016542401954631913,\n \"acc_norm\": 0.42681564245810055,\n \"acc_norm_stderr\": 0.016542401954631913\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4530638852672751,\n \"acc_stderr\": 0.012713845972358981,\n \"acc_norm\": 0.4530638852672751,\n \"acc_norm_stderr\": 0.012713845972358981\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.02619392354445412,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.02619392354445412\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4724602203182375,\n \"mc1_stderr\": 0.017476930190712187,\n \"mc2\": 0.6183630948757572,\n \"mc2_stderr\": 0.015107702805519594\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8042620363062352,\n \"acc_stderr\": 0.011151145042218327\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.731614859742229,\n \"acc_stderr\": 0.012205702688013671\n }\n}\n```", "repo_url": 
"https://huggingface.co/Toten5/Marcoroni-neural-chat-7B-v2", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-32-57.370313.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-32-57.370313.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-32-57.370313.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-32-57.370313.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-32-57.370313.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-32-57.370313.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["**/details_harness|winogrande|5_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-32-57.370313.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_32_57.370313", "path": ["results_2023-12-13T14-32-57.370313.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-32-57.370313.parquet"]}]}]} | 2023-12-13T14:36:33+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of Toten5/Marcoroni-neural-chat-7B-v2
Dataset automatically created during the evaluation run of model Toten5/Marcoroni-neural-chat-7B-v2 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
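A minimal sketch, assuming the repository id follows the usual `details_<org>__<model>` naming used by the Open LLM Leaderboard (the exact id is not restated in this excerpt) and using the "latest" split listed in the configurations above:
```python
from datasets import load_dataset

# Sketch only: the repository id below is inferred from the standard
# open-llm-leaderboard naming pattern and should be checked against the Hub.
data = load_dataset(
    "open-llm-leaderboard/details_Toten5__Marcoroni-neural-chat-7B-v2",
    "harness_winogrande_5",   # any of the 63 task configurations works here
    split="latest",           # or the timestamped split of a specific run
)
print(data[0])
```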
## Latest results
These are the latest results from run 2023-12-13T14:32:57.370313 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of Toten5/Marcoroni-neural-chat-7B-v2\n\n\n\nDataset automatically created during the evaluation run of model Toten5/Marcoroni-neural-chat-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:32:57.370313(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of Toten5/Marcoroni-neural-chat-7B-v2\n\n\n\nDataset automatically created during the evaluation run of model Toten5/Marcoroni-neural-chat-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:32:57.370313(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of Toten5/Marcoroni-neural-chat-7B-v2\n\n\n\nDataset automatically created during the evaluation run of model Toten5/Marcoroni-neural-chat-7B-v2 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:32:57.370313(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
8744f82724a0267c9e7a94a0c085f778188425be | # malay-conversational-speech-corpus
Mirror of https://magichub.com/datasets/malay-conversational-speech-corpus/; the license is the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License | malaysia-ai/malay-conversational-speech-corpus | [
"language:ms",
"region:us"
] | 2023-12-13T14:36:14+00:00 | {"language": ["ms"], "dataset_info": {"features": [{"name": "Y", "dtype": "string"}, {"name": "id", "dtype": "string"}, {"name": "gender", "dtype": "string"}, {"name": "filename", "dtype": {"audio": {"sampling_rate": 16000}}}], "splits": [{"name": "train", "num_bytes": 48785004.736, "num_examples": 3241}], "download_size": 47709555, "dataset_size": 48785004.736}} | 2023-12-13T14:38:21+00:00 | [] | [
"ms"
] | TAGS
#language-Malay (macrolanguage) #region-us
| # malay-conversational-speech-corpus
Mirror for URL license is Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License | [
"# malay-conversational-speech-corpus\n\nMirror for URL license is Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License"
] | [
"TAGS\n#language-Malay (macrolanguage) #region-us \n",
"# malay-conversational-speech-corpus\n\nMirror for URL license is Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License"
] | [
16,
34
] | [
"passage: TAGS\n#language-Malay (macrolanguage) #region-us \n# malay-conversational-speech-corpus\n\nMirror for URL license is Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License"
] |
3b06cc316e4701d119185a66c66f4a960a1693ef |
# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [viethq188/LeoScorpius-7B](https://huggingface.co/viethq188/LeoScorpius-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_viethq188__LeoScorpius-7B",
"harness_winogrande_5",
split="train")
```
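The aggregated metrics live in the "results" configuration; below is a minimal sketch for reading the most recent snapshot, assuming the "results" configuration exposes a "latest" split like the other configurations of this dataset:
```python
from datasets import load_dataset

# Sketch: load the aggregated results of the newest evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_viethq188__LeoScorpius-7B",
    "results",
    split="latest",
)
print(results[0])
```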
## Latest results
These are the [latest results from run 2023-12-13T14:34:30.167761](https://huggingface.co/datasets/open-llm-leaderboard/details_viethq188__LeoScorpius-7B/blob/main/results_2023-12-13T14-34-30.167761.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543158814876732,
"acc_stderr": 0.0318871339779985,
"acc_norm": 0.6548884709424827,
"acc_norm_stderr": 0.03253643431079404,
"mc1": 0.4847001223990208,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.6395257433031319,
"mc2_stderr": 0.015177012870117652
},
"harness|arc:challenge|25": {
"acc": 0.6663822525597269,
"acc_stderr": 0.013778687054176538,
"acc_norm": 0.6928327645051194,
"acc_norm_stderr": 0.013481034054980941
},
"harness|hellaswag|10": {
"acc": 0.6919936267675761,
"acc_stderr": 0.004607256752931883,
"acc_norm": 0.8701453893646683,
"acc_norm_stderr": 0.003354564257491871
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741713,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101736,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101736
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.025331202438944437,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.025331202438944437
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933772,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933772
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318994,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465066,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465066
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4681564245810056,
"acc_stderr": 0.016688553415612213,
"acc_norm": 0.4681564245810056,
"acc_norm_stderr": 0.016688553415612213
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4641460234680574,
"acc_stderr": 0.012737361318730581,
"acc_norm": 0.4641460234680574,
"acc_norm_stderr": 0.012737361318730581
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291296,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291296
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090081,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090081
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061463,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061463
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4847001223990208,
"mc1_stderr": 0.017495304473187902,
"mc2": 0.6395257433031319,
"mc2_stderr": 0.015177012870117652
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.6641394996209249,
"acc_stderr": 0.013009224714267369
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_viethq188__LeoScorpius-7B | [
"region:us"
] | 2023-12-13T14:37:23+00:00 | {"pretty_name": "Evaluation run of viethq188/LeoScorpius-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [viethq188/LeoScorpius-7B](https://huggingface.co/viethq188/LeoScorpius-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_viethq188__LeoScorpius-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T14:34:30.167761](https://huggingface.co/datasets/open-llm-leaderboard/details_viethq188__LeoScorpius-7B/blob/main/results_2023-12-13T14-34-30.167761.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543158814876732,\n \"acc_stderr\": 0.0318871339779985,\n \"acc_norm\": 0.6548884709424827,\n \"acc_norm_stderr\": 0.03253643431079404,\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6395257433031319,\n \"mc2_stderr\": 0.015177012870117652\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6663822525597269,\n \"acc_stderr\": 0.013778687054176538,\n \"acc_norm\": 0.6928327645051194,\n \"acc_norm_stderr\": 0.013481034054980941\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6919936267675761,\n \"acc_stderr\": 0.004607256752931883,\n \"acc_norm\": 0.8701453893646683,\n \"acc_norm_stderr\": 0.003354564257491871\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741713,\n \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741713\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n 
\"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101736,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101736\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.41005291005291006,\n \"acc_stderr\": 0.025331202438944437,\n \"acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.025331202438944437\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933772,\n \"acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933772\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6846153846153846,\n \"acc_stderr\": 
0.02355964698318994,\n \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465066,\n \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465066\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n 
\"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4681564245810056,\n \"acc_stderr\": 0.016688553415612213,\n \"acc_norm\": 0.4681564245810056,\n \"acc_norm_stderr\": 0.016688553415612213\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4641460234680574,\n \"acc_stderr\": 0.012737361318730581,\n \"acc_norm\": 0.4641460234680574,\n \"acc_norm_stderr\": 0.012737361318730581\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291296,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291296\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n \"acc_stderr\": 0.02411267824090081,\n \"acc_norm\": 0.8656716417910447,\n \"acc_norm_stderr\": 0.02411267824090081\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061463,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061463\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4847001223990208,\n \"mc1_stderr\": 0.017495304473187902,\n \"mc2\": 0.6395257433031319,\n \"mc2_stderr\": 0.015177012870117652\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6641394996209249,\n \"acc_stderr\": 0.013009224714267369\n }\n}\n```", "repo_url": "https://huggingface.co/viethq188/LeoScorpius-7B", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-34-30.167761.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-34-30.167761.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-34-30.167761.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T14-34-30.167761.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-34-30.167761.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T14-34-30.167761.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["**/details_harness|winogrande|5_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T14-34-30.167761.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T14_34_30.167761", "path": ["results_2023-12-13T14-34-30.167761.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T14-34-30.167761.parquet"]}]}]} | 2023-12-13T14:38:08+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B
Dataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
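For example, a minimal sketch using the `datasets` library — the repository name below follows the usual `open-llm-leaderboard/details_<org>__<model>` naming and is an assumption, as is the chosen configuration:

```python
# Minimal sketch (assumption: the details repository follows the usual
# "open-llm-leaderboard/details_<org>__<model>" naming used by the leaderboard).
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_viethq188__LeoScorpius-7B",
    "harness_winogrande_5",   # one of the 63 configurations listed in the metadata
    split="latest",           # "latest" always points to the newest run
)
print(data)
```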
## Latest results
These are the latest results from run 2023-12-13T14:34:30.167761 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B\n\n\n\nDataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:34:30.167761(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B\n\n\n\nDataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T14:34:30.167761(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
185,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B\n\n\n\nDataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T14:34:30.167761(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
9b3c96ae0adb7744a2c9fc72692842e6b3e25e33 |
# ASLNow!
ASLNow! is a web app designed to make learning ASL fingerspelling easy and fun! You can try it live at [asl-now.vercel.app](https://asl-now.vercel.app/).
Demo: [https://www.youtube.com/watch?v=Wi5tAxVasq8](https://www.youtube.com/watch?v=Wi5tAxVasq8)
## Dataset
This dataset, used to train the fingerspelling model, is licensed under the MIT License. It will be updated frequently as more data is collected.
The dataset was collected from multiple participants who were asked to sign ASL letters into a camera; hand landmarks were detected using
the [Mediapipe Web Hand Landmarker Solution](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker/web_js).
The landmarks are then parsed into JSON and stored in the folder of the class they belong to.
### Format
21 hand landmarks, each composed of `x`, `y` and `z` coordinates. The `x` and `y` coordinates are normalized to `[0.0, 1.0]` by the image width and height, respectively. The `z` coordinate represents the landmark depth, with the depth at the wrist being the origin. The smaller the value, the closer the landmark is to the camera. The magnitude of `z` uses roughly the same scale as `x`.

From: [https://developers.google.com/mediapipe/solutions/vision/hand_landmarker](https://developers.google.com/mediapipe/solutions/vision/hand_landmarker)
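A minimal sketch of reading one landmark file into a flat 63-value feature vector is shown here; the directory layout and file name match the raw example further below, and the helper name is illustrative:

```python
# Minimal sketch (assumptions: files are laid out as <LETTER>/<uuid>.json as in
# the example below; each file holds exactly 21 landmark dicts with x/y/z keys).
import json
from pathlib import Path

def load_landmarks(path: str) -> list[float]:
    """Read one sample and flatten it into a 63-value feature vector."""
    landmarks = json.loads(Path(path).read_text())
    return [coord for lm in landmarks for coord in (lm["x"], lm["y"], lm["z"])]

features = load_landmarks("B/1d20c568-8641-40b6-9c4a-2bff97ab6b49.json")
print(len(features))  # 63 = 21 landmarks x 3 coordinates
```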
Example (`./B/1d20c568-8641-40b6-9c4a-2bff97ab6b49.json`):
```json
[
{
"x": 0.795294463634491,
"y": 0.8062881827354431,
"z": 3.8308681382659415e-7
},
{
"x": 0.7690186500549316,
"y": 0.751120924949646,
"z": -0.019963227212429047
},
...
{
"x": 0.8564801812171936,
"y": 0.5965726375579834,
"z": 0.01904376409947872
},
{
"x": 0.8578274846076965,
"y": 0.5701698064804077,
"z": 0.017703533172607422
}
]
``` | sid220/asl-now-fingerspelling | [
"language:en",
"license:mit",
"doi:10.57967/hf/1494",
"region:us"
] | 2023-12-13T14:42:08+00:00 | {"language": ["en"], "license": "mit", "pretty_name": "asl-now"} | 2023-12-19T00:32:40+00:00 | [] | [
"en"
] | TAGS
#language-English #license-mit #doi-10.57967/hf/1494 #region-us
|
# ASLNow!
ASLNow! is a web app designed to make learning ASL fingerspelling easy and fun! You can try it live at URL.
Demo: URL
## Dataset
This dataset, used to train the fingerspelling model is licensed under the MIT License. It will be updated frequently as more data is collected.
The dataset is collected from multiple participants told to sign ASL letters into a camera and detecting hand landmarks
using
the Mediapipe Web Hand Landmarker Solution.
The landmarks are then parsed into a JSON format, and stored in the folder of the class they belong to.
### Format
21 hand landmarks, each composed of 'x', 'y' and 'z' coordinates. The 'x' and 'y' coordinates are normalized
to '[0.0, 1.0]' by the
image width and height, respectively. The 'z' coordinate represents the landmark depth, with the depth at the wrist
being
the origin. The smaller the value, the closer the landmark is to the camera. The magnitude of 'z' uses roughly the same
scale as x.
!Hand Landmarks
From: URL
Example ('./B/URL'):
| [
"# ASLNow!\nASLNow! is a web app designed to make learning ASL fingerspelling easy and fun! You can try it live at URL.\n\nDemo: URL",
"## Dataset\nThis dataset, used to train the fingerspelling model is licensed under the MIT License. It will be updated frequently as more data is collected.\n\nThe dataset is collected from multiple participants told to sign ASL letters into a camera and detecting hand landmarks\nusing\nthe Mediapipe Web Hand Landmarker Solution.\nThe landmarks are then parsed into a JSON format, and stored in the folder of the class they belong to.",
"### Format\n\n21 hand landmarks, each composed of 'x', 'y' and 'z' coordinates. The 'x' and 'y' coordinates are normalized\nto '[0.0, 1.0]' by the\nimage width and height, respectively. The 'z' coordinate represents the landmark depth, with the depth at the wrist\nbeing\nthe origin. The smaller the value, the closer the landmark is to the camera. The magnitude of 'z' uses roughly the same\nscale as x.\n\n!Hand Landmarks\nFrom: URL\n\nExample ('./B/URL'):"
] | [
"TAGS\n#language-English #license-mit #doi-10.57967/hf/1494 #region-us \n",
"# ASLNow!\nASLNow! is a web app designed to make learning ASL fingerspelling easy and fun! You can try it live at URL.\n\nDemo: URL",
"## Dataset\nThis dataset, used to train the fingerspelling model is licensed under the MIT License. It will be updated frequently as more data is collected.\n\nThe dataset is collected from multiple participants told to sign ASL letters into a camera and detecting hand landmarks\nusing\nthe Mediapipe Web Hand Landmarker Solution.\nThe landmarks are then parsed into a JSON format, and stored in the folder of the class they belong to.",
"### Format\n\n21 hand landmarks, each composed of 'x', 'y' and 'z' coordinates. The 'x' and 'y' coordinates are normalized\nto '[0.0, 1.0]' by the\nimage width and height, respectively. The 'z' coordinate represents the landmark depth, with the depth at the wrist\nbeing\nthe origin. The smaller the value, the closer the landmark is to the camera. The magnitude of 'z' uses roughly the same\nscale as x.\n\n!Hand Landmarks\nFrom: URL\n\nExample ('./B/URL'):"
] | [
27,
37,
98,
135
] | [
"passage: TAGS\n#language-English #license-mit #doi-10.57967/hf/1494 #region-us \n# ASLNow!\nASLNow! is a web app designed to make learning ASL fingerspelling easy and fun! You can try it live at URL.\n\nDemo: URL## Dataset\nThis dataset, used to train the fingerspelling model is licensed under the MIT License. It will be updated frequently as more data is collected.\n\nThe dataset is collected from multiple participants told to sign ASL letters into a camera and detecting hand landmarks\nusing\nthe Mediapipe Web Hand Landmarker Solution.\nThe landmarks are then parsed into a JSON format, and stored in the folder of the class they belong to.### Format\n\n21 hand landmarks, each composed of 'x', 'y' and 'z' coordinates. The 'x' and 'y' coordinates are normalized\nto '[0.0, 1.0]' by the\nimage width and height, respectively. The 'z' coordinate represents the landmark depth, with the depth at the wrist\nbeing\nthe origin. The smaller the value, the closer the landmark is to the camera. The magnitude of 'z' uses roughly the same\nscale as x.\n\n!Hand Landmarks\nFrom: URL\n\nExample ('./B/URL'):"
] |
33b568e27c8d0074d9dbd98a715e32708ccb4634 |
# Scientific Openly-Licensed Publications
This repository contains companion material for the following [publication](https://openaccess.thecvf.com/content/WACV2024/papers/Tarsi_SciOL_and_MuLMS-Img_Introducing_a_Large-Scale_Multimodal_Scientific_Dataset_and_WACV_2024_paper.pdf):
> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. **SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain.** WACV 2024.
Please cite this paper if using the dataset, and direct any questions regarding the dataset
to [Tim Tarsi](mailto:[email protected])
## Summary
Scientific Openly-Licensed Publications (SciOL) is the largest openly-licensed pre-training corpus for multimodal models in the scientific domain, covering multiple sciences including materials science, physics, and computer science. It consists of over 2.7M scientific publications converted into semi-structured data. SciOL contains over 14 billion tokens of extracted and structured text.
**Note: This repository only contains the textual data of SciOL. For the figures and captions see:**
[SciOL-CI](https://huggingface.co/datasets/Timbrt/SciOL-CI)
## Data Format
We provide the annotations of our dataset in the JSON format. Files are grouped and compressed as zip files. We provide a basic index to find annotations by DOI, PMID or DOAJ id and keywords.
## Annotation Schema
Annotations are structured as in the following schema:
```
{
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"doi": {
"type": "string"
},
"keywords": {
"type": "array",
"items": {
"type": "string"
}
},
"license": {
"type": "string"
},
"article": {
"type": "object",
"properties": {
"title": {
"type": "string"
},
"authors": {
"type": "array",
"items": {
"type": "string"
}
},
"abstract": {
"type": "string"
},
"body_text": {
"type": "string"
},
"bibliography": {
"type": "string"
}
}
}
}
}
```
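As a minimal sketch, a single record can be read straight from one of the zip archives; the archive name below is hypothetical, and the field access follows the schema above:

```python
# Minimal sketch (assumption: "articles_000.zip" is a hypothetical archive name;
# each zip member is one JSON record following the schema above).
import json
import zipfile

with zipfile.ZipFile("articles_000.zip") as zf:
    name = zf.namelist()[0]             # pick the first record in the archive
    record = json.loads(zf.read(name))

print(record["doi"], record["license"])
print(record["article"]["title"])
print(record["article"]["abstract"][:200])  # first 200 characters of the abstract
```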
## Citation
If you use our dataset in your scientific work, please cite our paper:
```
@InProceedings{Tarsi_2024_WACV,
author = {Tarsi, Tim and Adel, Heike and Metzen, Jan Hendrik and Zhang, Dan and Finco, Matteo and Friedrich, Annemarie},
title = {SciOL and MuLMS-Img: Introducing a Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {January},
year = {2024},
pages = {4560-4571}
}
```
## License
The SciOL corpus is released under the [CC BY 4.0](https://creativecommons.org/licenses/by/4.0/) license. | Timbrt/SciOL-text | [
"size_categories:10B<n<100B",
"language:en",
"license:cc-by-4.0",
"region:us"
] | 2023-12-13T14:57:23+00:00 | {"language": ["en"], "license": "cc-by-4.0", "size_categories": ["10B<n<100B"], "pretty_name": "Scientific Openly-Licensed Publications - Text"} | 2024-01-01T11:27:39+00:00 | [] | [
"en"
] | TAGS
#size_categories-10B<n<100B #language-English #license-cc-by-4.0 #region-us
|
# Scientific Openly-Licensed Publications
This repository contains companion material for the following publication:
> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.
Please cite this paper if using the dataset, and direct any questions regarding the dataset
to Tim Tarsi
## Summary
Scientific Openly-Licensed Publications (SciOL) is the largest openly-licensed pre-training corpus for multimodal models in the scientific domain, covering multiple sciences including materials science, physics, and computer science. It consists of over 2.7M scientific scientific publications converted into semi-structured data. SciOL contains over 14 Billion tokens of extracted and structured text.
Note: This repository only contains the textual data of SciOL. For the figures and captions see:
SciOL-CI
## Data Format
We provide the annotations of our dataset in the JSON format. Files are grouped and compressed as zip files. We provide a basic index to find annotations by DOI, PMID or DOAJ id and keywords.
## Annotation Schema
Annotations are structured as in the following schema:
If you use our dataset in your scientific, please cite our paper:
## License
The SciOL corpus is released under the CC BY 4.0 license. | [
"# Scientific Openly-Licensed Publications\nThis repository contains companion material for the following publication:\n\n> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.\n\nPlease cite this paper if using the dataset, and direct any questions regarding the dataset\nto Tim Tarsi",
"## Summary\nScientific Openly-Licensed Publications (SciOL) is the largest openly-licensed pre-training corpus for multimodal models in the scientific domain, covering multiple sciences including materials science, physics, and computer science. It consists of over 2.7M scientific scientific publications converted into semi-structured data. SciOL contains over 14 Billion tokens of extracted and structured text.\n\n\nNote: This repository only contains the textual data of SciOL. For the figures and captions see: \nSciOL-CI",
"## Data Format\nWe provide the annotations of our dataset in the JSON format. Files are grouped and compressed as zip files. We provide a basic index to find annotations by DOI, PMID or DOAJ id and keywords.",
"## Annotation Schema\n\nAnnotations are structured as in the following schema:\n\n\n\nIf you use our dataset in your scientific, please cite our paper:",
"## License\n\nThe SciOL corpus is released under the CC BY 4.0 license."
] | [
"TAGS\n#size_categories-10B<n<100B #language-English #license-cc-by-4.0 #region-us \n",
"# Scientific Openly-Licensed Publications\nThis repository contains companion material for the following publication:\n\n> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.\n\nPlease cite this paper if using the dataset, and direct any questions regarding the dataset\nto Tim Tarsi",
"## Summary\nScientific Openly-Licensed Publications (SciOL) is the largest openly-licensed pre-training corpus for multimodal models in the scientific domain, covering multiple sciences including materials science, physics, and computer science. It consists of over 2.7M scientific scientific publications converted into semi-structured data. SciOL contains over 14 Billion tokens of extracted and structured text.\n\n\nNote: This repository only contains the textual data of SciOL. For the figures and captions see: \nSciOL-CI",
"## Data Format\nWe provide the annotations of our dataset in the JSON format. Files are grouped and compressed as zip files. We provide a basic index to find annotations by DOI, PMID or DOAJ id and keywords.",
"## Annotation Schema\n\nAnnotations are structured as in the following schema:\n\n\n\nIf you use our dataset in your scientific, please cite our paper:",
"## License\n\nThe SciOL corpus is released under the CC BY 4.0 license."
] | [
31,
113,
127,
56,
32,
15
] | [
"passage: TAGS\n#size_categories-10B<n<100B #language-English #license-cc-by-4.0 #region-us \n# Scientific Openly-Licensed Publications\nThis repository contains companion material for the following publication:\n\n> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.\n\nPlease cite this paper if using the dataset, and direct any questions regarding the dataset\nto Tim Tarsi## Summary\nScientific Openly-Licensed Publications (SciOL) is the largest openly-licensed pre-training corpus for multimodal models in the scientific domain, covering multiple sciences including materials science, physics, and computer science. It consists of over 2.7M scientific scientific publications converted into semi-structured data. SciOL contains over 14 Billion tokens of extracted and structured text.\n\n\nNote: This repository only contains the textual data of SciOL. For the figures and captions see: \nSciOL-CI## Data Format\nWe provide the annotations of our dataset in the JSON format. Files are grouped and compressed as zip files. We provide a basic index to find annotations by DOI, PMID or DOAJ id and keywords.## Annotation Schema\n\nAnnotations are structured as in the following schema:\n\n\n\nIf you use our dataset in your scientific, please cite our paper:## License\n\nThe SciOL corpus is released under the CC BY 4.0 license."
] |
22b093f1981a68a2ddbc531f5076549071f02441 |
# Multi Layer Materials Science Image Corpus
This repository contains companion material for the following [publication](https://openaccess.thecvf.com/content/WACV2024/papers/Tarsi_SciOL_and_MuLMS-Img_Introducing_a_Large-Scale_Multimodal_Scientific_Dataset_and_WACV_2024_paper.pdf):
> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. **SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain.** WACV 2024.
Please cite this paper if using the dataset, and direct any questions regarding the dataset
to [Tim Tarsi](mailto:[email protected])
## Summary
The Multi-Layer Materials Science (MuLMS) corpus [1] is a dataset of 50 scientific publications in the materials science domain annotated for various natural language processing tasks. MuLMS-Img extends this dataset by providing over 14,500 high-quality manual annotations for various image-text tasks, e.g., Figure Type Classification, Optical Character Recognition (OCR) and Text Role Labeling, and Figure Retrieval.
## Data Format
We provide the annotations of our dataset in JSON format, split into train, test and dev sets. Images are provided as PNG files.
## Annotation Schema
Annotations are structured as in the following schema:
```
{
"$schema": "http://json-schema.org/draft-07/schema#",
"type": "object",
"properties": {
"task1": {
"name": "Chart Classification",
"output": {
"chart_type": {
"type": "string"
}
}
},
"task2": {
"name": "Text Detection and Recognition",
"output": {
"text_blocks": {
"type": "array"
}
}
},
"task3": {
"name": "Image Retrieval",
"output": {
"caption": {
"type": "string"
},
"queries": {
"type": "array"
}
}
}
}
}
```
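As a minimal sketch (the file path below is hypothetical, and the access pattern assumes instances mirror the nesting of the schema above), one annotation can be read as follows:

```python
# Minimal sketch (assumptions: "train/annotation_0001.json" is a hypothetical path;
# field names follow the schema shown above).
import json
from pathlib import Path

ann = json.loads(Path("train/annotation_0001.json").read_text())

figure_type = ann["task1"]["output"]["chart_type"]   # Figure Type Classification
text_blocks = ann["task2"]["output"]["text_blocks"]  # OCR bounding boxes, text and roles
caption = ann["task3"]["output"]["caption"]          # caption used for Figure Retrieval
queries = ann["task3"]["output"]["queries"]          # search-style queries

print(figure_type, len(text_blocks), len(queries))
```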
## Proposed Tasks
In our paper, we introduce the following subtasks and provide human annotations to develop computational models.
**Figure Type Classification** constitutes a multi-class classification task of identifying the type of a figure, e.g., chart types such as bar plots, photographs or illustrations.
**Optical Character Recognition (OCR) and Role Labeling** requires bounding-box detection and transcription of the text within the bounding box, plus identifying the role of the content in the figure, e.g., ticks, legends, or axis labels.
**Figure Retrieval** is based on brief, *search-style* textual queries.
Our aim is to create real-world search queries that might be used in a retrieval system, where the style typically deviates from the descriptive and wordy nature of captions.
## Citation
If you use our dataset in your work, please cite our paper:
```
@InProceedings{Tarsi_2024_WACV,
author = {Tarsi, Tim and Adel, Heike and Metzen, Jan Hendrik and Zhang, Dan and Finco, Matteo and Friedrich, Annemarie},
title = {SciOL and MuLMS-Img: Introducing a Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain},
booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
month = {January},
year = {2024},
pages = {4560-4571}
}
```
## License
The MuLMS-Img corpus is released under the [CC BY-SA 4.0](https://creativecommons.org/licenses/by-sa/4.0/) license.
## References
[1] Timo Pierre Schrader, Matteo Finco, Stefan Grünewald, Felix Hildebrand and Annemarie Friedrich. MuLMS: A Multi-Layer Annotated Text Corpus for Information Extraction in the Materials Science Domain. WIESP 2023. | Timbrt/MuLMS-Img | [
"task_categories:image-classification",
"task_categories:text-to-image",
"task_categories:object-detection",
"size_categories:1K<n<10K",
"language:en",
"license:cc-by-sa-4.0",
"region:us"
] | 2023-12-13T15:15:04+00:00 | {"language": ["en"], "license": "cc-by-sa-4.0", "size_categories": ["1K<n<10K"], "task_categories": ["image-classification", "text-to-image", "object-detection"], "pretty_name": "Multi Layer Materials Science Image Corpus"} | 2024-01-01T11:29:50+00:00 | [] | [
"en"
] | TAGS
#task_categories-image-classification #task_categories-text-to-image #task_categories-object-detection #size_categories-1K<n<10K #language-English #license-cc-by-sa-4.0 #region-us
|
# Multi Layer Materials Science Image Corpus
This repository contains companion material for the following publication:
> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.
Please cite this paper if using the dataset, and direct any questions regarding the dataset
to Tim Tarsi
## Summary
The Multi-Layer Materials Science (MuLMS) corpus [1] is a dataset of 50 scientific publications in the materials science domain annotated for various natural language processing tasks. MuLMS-Img extends this dataset by providing over 14500 high quality, manual annotations for various image-text tasks, e.g., Figure type Classification, Optical Character Recognition (OCR) and Text Role Labeling and Figure Retrieval.
## Data Format
We provide the annotations of our dataset in the JSON format, split into a train, test and dev set. Images are provided as PNG files.
## Annotation Schema
Annotations are structured as in the following schema:
## Proposed Tasks
In our paper, we introduce the following subtasks and provide human annotations to develop computational models.
Figure Type Classification constitutes a multi-class classification task of identifying the type of a figure, e.g., chart types such as bar plots, photographs or illustrations.
Optical Character Recognition (OCR) and Role Labeling requires bounding-box detection and transcription of the text within the bounding box, plus identifying the role of the content in the figure, e.g., ticks, legends, or axis labels.
Figure Retrieval is based on brief, *search-style* textual queries.
Our aim is to create real-world search queries that might be used in a retrieval system, where the style typically deviates from the descriptive and wordy nature of captions.
If you use our dataset in your work, please cite our paper:
## License
The MuLMS-Img corpus is released under the CC BY-SA 4.0 license.
## References
[1] Timo Pierre Schrader, Matteo Finco, Stefan Grünewald, Felix Hildebrand and Annemarie Friedrich. MuLMS: A Multi-Layer Annotated Text Corpus for Information Extraction in the Materials Science Domain. WIESP 2023. | [
"# Multi Layer Materials Science Image Corpus\nThis repository contains companion material for the following publication:\n\n> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.\n\nPlease cite this paper if using the dataset, and direct any questions regarding the dataset\nto Tim Tarsi",
"## Summary\nThe Multi-Layer Materials Science (MuLMS) corpus [1] is a dataset of 50 scientific publications in the materials science domain annotated for various natural language processing tasks. MuLMS-Img extends this dataset by providing over 14500 high quality, manual annotations for various image-text tasks, e.g., Figure type Classification, Optical Character Recognition (OCR) and Text Role Labeling and Figure Retrieval.",
"## Data Format\nWe provide the annotations of our dataset in the JSON format, split into a train, test and dev set. Images are provided as PNG files.",
"## Annotation Schema\n\nAnnotations are structured as in the following schema:",
"## Proposed Tasks\nIn our paper, we introduce the following subtasks and provide human annotations to develop computational models.\n\nFigure Type Classification constitutes a multi-class classification task of identifying the type of a figure, e.g., chart types such as bar plots, photographs or illustrations.\n\nOptical Character Recognition (OCR) and Role Labeling requires bounding-box detection and transcription of the text within the bounding box, plus identifying the role of the content in the figure, e.g., ticks, legends, or axis labels.\n\nFigure Retrieval is based on brief, *search-style* textual queries.\nOur aim is to create real-world search queries that might be used in a retrieval system, where the style typically deviates from the descriptive and wordy nature of captions.\n\n\nIf you use our dataset in your work, please cite our paper:",
"## License\n\nThe MuLMS-Img corpus is released under the CC BY-SA 4.0 license.",
"## References\n\n[1] Timo Pierre Schrader, Matteo Finco, Stefan Grünewald, Felix Hildebrand and Annemarie Friedrich. MuLMS: A Multi-Layer Annotated Text Corpus for Information Extraction in the Materials Science Domain. WIESP 2023."
] | [
"TAGS\n#task_categories-image-classification #task_categories-text-to-image #task_categories-object-detection #size_categories-1K<n<10K #language-English #license-cc-by-sa-4.0 #region-us \n",
"# Multi Layer Materials Science Image Corpus\nThis repository contains companion material for the following publication:\n\n> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.\n\nPlease cite this paper if using the dataset, and direct any questions regarding the dataset\nto Tim Tarsi",
"## Summary\nThe Multi-Layer Materials Science (MuLMS) corpus [1] is a dataset of 50 scientific publications in the materials science domain annotated for various natural language processing tasks. MuLMS-Img extends this dataset by providing over 14500 high quality, manual annotations for various image-text tasks, e.g., Figure type Classification, Optical Character Recognition (OCR) and Text Role Labeling and Figure Retrieval.",
"## Data Format\nWe provide the annotations of our dataset in the JSON format, split into a train, test and dev set. Images are provided as PNG files.",
"## Annotation Schema\n\nAnnotations are structured as in the following schema:",
"## Proposed Tasks\nIn our paper, we introduce the following subtasks and provide human annotations to develop computational models.\n\nFigure Type Classification constitutes a multi-class classification task of identifying the type of a figure, e.g., chart types such as bar plots, photographs or illustrations.\n\nOptical Character Recognition (OCR) and Role Labeling requires bounding-box detection and transcription of the text within the bounding box, plus identifying the role of the content in the figure, e.g., ticks, legends, or axis labels.\n\nFigure Retrieval is based on brief, *search-style* textual queries.\nOur aim is to create real-world search queries that might be used in a retrieval system, where the style typically deviates from the descriptive and wordy nature of captions.\n\n\nIf you use our dataset in your work, please cite our paper:",
"## License\n\nThe MuLMS-Img corpus is released under the CC BY-SA 4.0 license.",
"## References\n\n[1] Timo Pierre Schrader, Matteo Finco, Stefan Grünewald, Felix Hildebrand and Annemarie Friedrich. MuLMS: A Multi-Layer Annotated Text Corpus for Information Extraction in the Materials Science Domain. WIESP 2023."
] | [
67,
112,
111,
37,
17,
212,
21,
57
] | [
"passage: TAGS\n#task_categories-image-classification #task_categories-text-to-image #task_categories-object-detection #size_categories-1K<n<10K #language-English #license-cc-by-sa-4.0 #region-us \n# Multi Layer Materials Science Image Corpus\nThis repository contains companion material for the following publication:\n\n> Tim Tarsi, Heike Adel, Jan Hendrik Metzen, Dan Zhang, Matteo Finco, Annemarie Friedrich. SciOL and MuLMS-Img: Introducing A Large-Scale Multimodal Scientific Dataset and Models for Image-Text Tasks in the Scientific Domain. WACV 2024.\n\nPlease cite this paper if using the dataset, and direct any questions regarding the dataset\nto Tim Tarsi## Summary\nThe Multi-Layer Materials Science (MuLMS) corpus [1] is a dataset of 50 scientific publications in the materials science domain annotated for various natural language processing tasks. MuLMS-Img extends this dataset by providing over 14500 high quality, manual annotations for various image-text tasks, e.g., Figure type Classification, Optical Character Recognition (OCR) and Text Role Labeling and Figure Retrieval.## Data Format\nWe provide the annotations of our dataset in the JSON format, split into a train, test and dev set. Images are provided as PNG files.## Annotation Schema\n\nAnnotations are structured as in the following schema:"
] |
21636c0ae2d2123a60bd2e65efa2b9bef46b731f |
## BSC Dolly 15k EN
Reviewed version from the [Argilla Dolly v2 English version](https://huggingface.co/datasets/argilla/databricks-dolly-15k-curated-multilingual), originally created by [Databricks](https://huggingface.co/datasets/databricks/databricks-dolly-15k).
We provide two subsets: "annotated", where some instances were labelled with potential problems; and "filtered", which only contains the instances without the issues that we observed.
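Both subsets are regular Hugging Face `datasets` configurations and can be loaded by name; a minimal sketch (the row counts below come from the repository metadata):

```python
from datasets import load_dataset

# "annotated" keeps every instance and flags potential problems in the `labels`
# column; "filtered" keeps only the instances without the issues we observed.
annotated = load_dataset("BSC-LT/bsc-dolly-15k-en", "annotated", split="train")
filtered = load_dataset("BSC-LT/bsc-dolly-15k-en", "filtered", split="train")

print(len(annotated), len(filtered))  # 15015 and 10157 rows at the time of writing
print(annotated[0]["instruction"][:80], annotated[0]["labels"])
```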
## Annotation process
While analysing the Argilla Dolly v2 English version, we observed the following:
1. Task classification:
- There are three classes with context: 'Closed QA', 'Information Extraction' and 'Summarization'. The rest without context.
- Context is not necessary in all cases and there are instructions that already contain context.
- Incorrect categories (the intention does not always correspond to the category).
2. Confusion between "Summarization" and "Open Generative QA" / "Information Extraction" tasks:
- Tasks categorized as "Summarization" have in some cases the intent of "Open Generative QA" / "Information Extraction", and due to their dependency on context, the answer is longer.
3. To note:
- 15,014 examples, half of "QA" type in various formats.
- 70% have no context; when they do, they come from the first part of Wikipedia.
- Many answers are also from Wikipedia.
- Possible improvements in cleaning up text extracted from Wikipedia and handling acronyms.
4. Errors in the dataset:
- Some summaries are longer than the original text.
- Some contexts in "Information Extraction" do not contain the exact information to answer the question asked.
- There are many repeated questions that are kept because the answer is different in each case.
From the previous observations, we performed the following processing (an illustrative clean-up sketch follows this list):
- Processed "context" column to:
    - Remove spellings, citations, or unit conversions inside (parentheses) and [brackets].
- Removed source webpage links.
- Removed:
- Summary instances where intent is clear & response is longer than context (63)
- Instances where the information is not explicitly mentioned in the context (3)
- Instances with webpage links in the response or instruction (29)
- Exact (instruction/context/response) duplicates (14)
- Instruction/context duplicates (9)
- Instances where instruction is most similar to the response (6)
- Changes:
  - Some instances in Summarization/Information Extraction/Closed QA are lacking context after Argilla's curation process. These instances are moved to General QA since they no longer have context and ask about specifics (86).
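The first clean-up steps could be approximated with a few regular expressions; this is only a rough sketch, not the exact script used to build the release:

```python
import re

def clean_context(text: str) -> str:
    """Rough approximation of the clean-up applied to the "context" column."""
    text = re.sub(r"\([^()]*\)", "", text)       # drop (spellings, citations, unit conversions)
    text = re.sub(r"\[[^\[\]]*\]", "", text)     # drop [bracketed] notes
    text = re.sub(r"https?://\S+", "", text)     # drop source webpage links
    return re.sub(r"\s{2,}", " ", text).strip()  # collapse leftover whitespace

print(clean_context("Kyiv (also Kiev) [1] is the capital of Ukraine, see https://example.org ."))
```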
| BSC-LT/bsc-dolly-15k-en | [
"region:us"
] | 2023-12-13T15:16:21+00:00 | {"dataset_info": [{"config_name": "annotated", "features": [{"name": "id", "dtype": "int64"}, {"name": "category", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "context", "dtype": "string"}, {"name": "labels", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 11901412, "num_examples": 15015}], "download_size": 7553519, "dataset_size": 11901412}, {"config_name": "filtered", "features": [{"name": "id", "dtype": "int64"}, {"name": "category", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "response", "dtype": "string"}, {"name": "context", "dtype": "float64"}, {"name": "labels", "dtype": "float64"}], "splits": [{"name": "train", "num_bytes": 4398990, "num_examples": 10157}], "download_size": 2749289, "dataset_size": 4398990}], "configs": [{"config_name": "annotated", "data_files": [{"split": "train", "path": "annotated/train-*"}]}, {"config_name": "filtered", "data_files": [{"split": "train", "path": "filtered/train-*"}]}]} | 2023-12-13T16:13:36+00:00 | [] | [] | TAGS
#region-us
|
## BSC Dolly 15k EN
Reviewed version from the Argilla Dolly v2 English version, originally created by Databricks.
We provide two subsets: "annotated", where some instances were labelled with potential problems; and "filtered", which only contains the instances without the issues that we observed.
## Annotation process
While analysing the Argilla Dolly v2 English version, we observed the following:
1. Task classification:
- There are three classes with context: 'Closed QA', 'Information Extraction' and 'Summarization'. The rest without context.
- Context is not necessary in all cases and there are instructions that already contain context.
- Incorrect categories (the intention does not always correspond to the category).
-
2. Confusion between "Summarization" and "Open Generative QA" / "Information Extraction" tasks:
- Tasks categorized as "Summarization" have in some cases the intent of "Open Generative QA" / "Information Extraction", and due to their dependency on context, the answer is longer.
3. To note:
- 15,014 examples, half of "QA" type in various formats.
- 70% have no context; when they do, they come from the first part of Wikipedia.
- Many answers are also from Wikipedia.
- Possible improvements in cleaning up text extracted from Wikipedia and handling acronyms.
4. Errors in the dataset:
- Some summaries are longer than the original text.
- Some contexts in "Information Extraction" do not contain the exact information to answer the question asked.
- There are many repeated questions that are kept because the answer is different in each case.
From the previous observations, we performed the following processing:
- Processed "context" column to:
- Remove spellings, citations, or unit conversions inside (parenthesis) and [brackets].
- Removed source webpage links.
- Removed:
- Summary instances where intent is clear & response is longer than context (63)
- Instances where the information is not explicitly mentioned in the context (3)
- Instances with webpage links in the response or instruction (29)
- Exact (instruction/context/response) duplicates (14)
- Instruction/context duplicates (9)
- Instances where instruction is most similar to the response (6)
-
- Changes:
  - Some instances in Summarization/Information Extraction/Closed QA are lacking context after Argilla's curation process. These instances are moved to General QA since they no longer have context and ask about specifics (86).
| [
"## BSC Dolly 15k EN\n\nReviewed version from the Argilla Dolly v2 English version, originally created by Databricks.\n\nWe provide two subsets: \"annotated\", where some instances were labelled with potential problems; and \"filtered\", which only contains the instances without the issues that we observed.",
"## Annotation process\n\nWhile analysing the Argilla Dolly v2 English version, we observed the following:\n1. Task classification:\n - There are three classes with context: 'Closed QA', 'Information Extraction' and 'Summarization'. The rest without context.\n - Context is not necessary in all cases and there are instructions that already contain context.\n - Incorrect categories (the intention does not always correspond to the category).\n - \n2. Confusion between \"Summarization\" and \"Open Generative QA\" / \"Information Extraction\" tasks:\n - Tasks categorized as \"Summarization\" have in some cases the intent of \"Open Generative QA\" / \"Information Extraction\", and due to their dependency on context, the answer is longer.\n\n3. To note:\n - 15,014 examples, half of \"QA\" type in various formats.\n - 70% have no context; when they do, they come from the first part of Wikipedia.\n - Many answers are also from Wikipedia.\n - Possible improvements in cleaning up text extracted from Wikipedia and handling acronyms.\n\n4. Errors in the dataset:\n - Some summaries are longer than the original text.\n - Some contexts in \"Information Extraction\" do not contain the exact information to answer the question asked.\n - There are many repeated questions that are kept because the answer is different in each case.\n\n\nFrom the previous observations, we performed the following processing:\n- Processed \"context\" column to:\n - Remove spellings, citations, or unit conversions inside (parenthesis) and [brackets].\n - Removed source webpage links.\n\n- Removed:\n - Summary instances where intent is clear & response is longer than context (63)\n - Instances where the information is not explicitly mentioned in the context (3)\n - Instances with webpage links in the response or instruction (29)\n - Exact (instruction/context/response) duplicates (14)\n - Instruction/context duplicates (9)\n - Instances where instruction is most similar to the response (6)\n - \n- Changes:\n - Some instances in Summarization/Information Extraction/ Closed QA are lacking context after Argilla's curation process. These instances are moved to General QA since they have no longer context and ask about specifics (86)."
] | [
"TAGS\n#region-us \n",
"## BSC Dolly 15k EN\n\nReviewed version from the Argilla Dolly v2 English version, originally created by Databricks.\n\nWe provide two subsets: \"annotated\", where some instances were labelled with potential problems; and \"filtered\", which only contains the instances without the issues that we observed.",
"## Annotation process\n\nWhile analysing the Argilla Dolly v2 English version, we observed the following:\n1. Task classification:\n - There are three classes with context: 'Closed QA', 'Information Extraction' and 'Summarization'. The rest without context.\n - Context is not necessary in all cases and there are instructions that already contain context.\n - Incorrect categories (the intention does not always correspond to the category).\n - \n2. Confusion between \"Summarization\" and \"Open Generative QA\" / \"Information Extraction\" tasks:\n - Tasks categorized as \"Summarization\" have in some cases the intent of \"Open Generative QA\" / \"Information Extraction\", and due to their dependency on context, the answer is longer.\n\n3. To note:\n - 15,014 examples, half of \"QA\" type in various formats.\n - 70% have no context; when they do, they come from the first part of Wikipedia.\n - Many answers are also from Wikipedia.\n - Possible improvements in cleaning up text extracted from Wikipedia and handling acronyms.\n\n4. Errors in the dataset:\n - Some summaries are longer than the original text.\n - Some contexts in \"Information Extraction\" do not contain the exact information to answer the question asked.\n - There are many repeated questions that are kept because the answer is different in each case.\n\n\nFrom the previous observations, we performed the following processing:\n- Processed \"context\" column to:\n - Remove spellings, citations, or unit conversions inside (parenthesis) and [brackets].\n - Removed source webpage links.\n\n- Removed:\n - Summary instances where intent is clear & response is longer than context (63)\n - Instances where the information is not explicitly mentioned in the context (3)\n - Instances with webpage links in the response or instruction (29)\n - Exact (instruction/context/response) duplicates (14)\n - Instruction/context duplicates (9)\n - Instances where instruction is most similar to the response (6)\n - \n- Changes:\n - Some instances in Summarization/Information Extraction/ Closed QA are lacking context after Argilla's curation process. These instances are moved to General QA since they have no longer context and ask about specifics (86)."
] | [
6,
72,
509
] | [
"passage: TAGS\n#region-us \n## BSC Dolly 15k EN\n\nReviewed version from the Argilla Dolly v2 English version, originally created by Databricks.\n\nWe provide two subsets: \"annotated\", where some instances were labelled with potential problems; and \"filtered\", which only contains the instances without the issues that we observed."
] |
d77dedd28e0145e8c075c15c407946e6164b6aca | This dataset contains the articles from [Ukrainska Pravda](https://www.pravda.com.ua/) of the years 2022-2023, in all translations.
The dataset was created as part of my Master's Thesis, better documentation will follow. For now:
### Basics
One row of the dataset contains an article, title/author/tags in up to three languages (ukr-rus-eng) w/ the corresponding title, author and tags.
Different translations of the same article often have inconsistent tags, so the main `tags` column contains the representations of the tags from all languages (each tag is named after its URI on the UP website).
The mapping of each tag to its URIs and names in all the languages it appears in is found in the `tags_mapping.json` file in the metadata. The list of URIs for all downloaded articles can be found there as well.
### Files
- Two versions:
- The version 0.0.1 (split name `incomplete`) covers articles from 01.01.2022 until 12.12.2023, kept for now as it's used in some other datasets
- **The version 0.0.2 (split name `train`) is the one you need** and contains all articles from 01.01.2022 till 31.12.2023
- File structure:
- `data/train` is the full 2y 0.0.2 dataset, the one you need
- `data/incomplete` is the old 0.0.1 version
- `metadata/` contains the tags mappings and list of downloaded URIs for both versions
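
Putting the layout above together, loading the articles and the tag mapping could look like the sketch below; whether the default configuration exposes the splits directly, and the exact path of the mapping file inside `metadata/`, are assumptions to adjust to the actual repository layout:

```python
import json

from datasets import load_dataset
from huggingface_hub import hf_hub_download

# The "train" split is the complete 0.0.2 version described above.
articles = load_dataset("shamotskyi/ukr_pravda_2y", split="train")
print(articles[0].keys())

# Assumed path of the tag mapping file inside the metadata/ folder.
mapping_path = hf_hub_download(
    repo_id="shamotskyi/ukr_pravda_2y",
    filename="metadata/tags_mapping.json",
    repo_type="dataset",
)
with open(mapping_path, encoding="utf-8") as f:
    tags_mapping = json.load(f)
print(next(iter(tags_mapping.items())))
```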
### The rest
- **<https://serhii.net/dtb/2023-12-13-231213-1710-ukrainska-pravda-dataset/>** is the draft of the relevant thesis section
- **[pchr8/up_crawler](https://github.com/pchr8/up_crawler)** is the crawler I wrote to gather this dataset
<br><br>
For any questions, my first name is Serhii, and my email is my_first_name@my_first_name.net.
| shamotskyi/ukr_pravda_2y | [
"multilinguality:multilingual",
"language:uk",
"language:en",
"language:ru",
"license:cc-by-nc-4.0",
"doi:10.57967/hf/1770",
"region:us"
] | 2023-12-13T15:29:04+00:00 | {"language": ["uk", "en", "ru"], "license": "cc-by-nc-4.0", "multilinguality": ["multilingual"], "pretty_name": "Ukrainska Pravda articles in ukr/rus/eng published on or after 01.01.2022"} | 2024-02-15T15:17:59+00:00 | [] | [
"uk",
"en",
"ru"
] | TAGS
#multilinguality-multilingual #language-Ukrainian #language-English #language-Russian #license-cc-by-nc-4.0 #doi-10.57967/hf/1770 #region-us
| This dataset contains the articles from Ukrainska Pravda of the years 2022-2023, in all translations.
The dataset was created as part of my Master's Thesis, better documentation will follow. For now:
### Basics
One row of the dataset contains an article, title/author/tags in up to three languages (ukr-rus-eng) w/ the corresponding title, author and tags.
Different translations of the same article often have inconsistent tags, so the main 'tags' column contains the representations of the tags from all languages (each tag is named after its URI on the UP website).
The mapping of each tag to its URIs and names in all the languages it appears in is found in the 'tags_mapping.json' file in the metadata. The list of URIs for all downloaded articles can be found there as well.
### Files
- Two versions:
- The version 0.0.1 (split name 'incomplete') covers articles from 01.01.2022 until 12.12.2023, kept for now as it's used in some other datasets
- The version 0.0.2 (split name 'train') is the one you need and contains all articles from 01.01.2022 till 31.12.2023
- File structure:
- 'data/train' is the full 2y 0.0.2 dataset, the one you need
- 'data/incomplete' is the old 0.0.1 version
- 'metadata/' contains the tags mappings and list of downloaded URIs for both versions
### The rest
- <URL is the draft of the relevant thesis section
- pchr8/up_crawler is the crawler I wrote to gather this dataset
<br><br>
For any questions, my first name is Serhii, and my email is my_first_name@my_first_name.net.
| [
"### Basics\n\nOne row of the dataset contains an article, title/author/tags in up to three languages (ukr-rus-eng) w/ the corresponding title, author and tags.\nDifferent translations of the same article often have inconsistent tags, so the main 'tags' column contains the representations of the tags from all languages (each tag is named after its URI on the UP website). \n\nThe mapping of each tag to its URIs and names in all the languages it's present in is fuond in the 'tags_mapping.json' file, found in the metadata. The list of URIs for all downloaded articles can be found there as well.",
"### Files \n\n- Two versions:\n\t- The version 0.0.1 (split name 'incomplete') covers articles from 01.01.2022 until 12.12.2023, kept for now as it's used in some other datasets\n\t- The version 0.0.2 (split name 'train') is the one you need and contains all articles from 01.01.2022 till 31.12.2023\n- File structure:\n\t- 'data/train' is the full 2y 0.0.2 dataset, the one you need\n\t- 'data/incomplete' is the old 0.0.1 version\n\t- 'metadata/' contains the tags mappings and list of downloaded URIs for both versions",
"### The rest\n- <URL is the draft of the relevant thesis section\n- pchr8/up_crawler is the crawler I wrote to gather this dataset\n<br><br>\n\nFor any questions, my first name is Serhii, and my email is my_first_name@my_first_name.net."
] | [
"TAGS\n#multilinguality-multilingual #language-Ukrainian #language-English #language-Russian #license-cc-by-nc-4.0 #doi-10.57967/hf/1770 #region-us \n",
"### Basics\n\nOne row of the dataset contains an article, title/author/tags in up to three languages (ukr-rus-eng) w/ the corresponding title, author and tags.\nDifferent translations of the same article often have inconsistent tags, so the main 'tags' column contains the representations of the tags from all languages (each tag is named after its URI on the UP website). \n\nThe mapping of each tag to its URIs and names in all the languages it's present in is fuond in the 'tags_mapping.json' file, found in the metadata. The list of URIs for all downloaded articles can be found there as well.",
"### Files \n\n- Two versions:\n\t- The version 0.0.1 (split name 'incomplete') covers articles from 01.01.2022 until 12.12.2023, kept for now as it's used in some other datasets\n\t- The version 0.0.2 (split name 'train') is the one you need and contains all articles from 01.01.2022 till 31.12.2023\n- File structure:\n\t- 'data/train' is the full 2y 0.0.2 dataset, the one you need\n\t- 'data/incomplete' is the old 0.0.1 version\n\t- 'metadata/' contains the tags mappings and list of downloaded URIs for both versions",
"### The rest\n- <URL is the draft of the relevant thesis section\n- pchr8/up_crawler is the crawler I wrote to gather this dataset\n<br><br>\n\nFor any questions, my first name is Serhii, and my email is my_first_name@my_first_name.net."
] | [
53,
163,
149,
73
] | [
"passage: TAGS\n#multilinguality-multilingual #language-Ukrainian #language-English #language-Russian #license-cc-by-nc-4.0 #doi-10.57967/hf/1770 #region-us \n### Basics\n\nOne row of the dataset contains an article, title/author/tags in up to three languages (ukr-rus-eng) w/ the corresponding title, author and tags.\nDifferent translations of the same article often have inconsistent tags, so the main 'tags' column contains the representations of the tags from all languages (each tag is named after its URI on the UP website). \n\nThe mapping of each tag to its URIs and names in all the languages it's present in is fuond in the 'tags_mapping.json' file, found in the metadata. The list of URIs for all downloaded articles can be found there as well.### Files \n\n- Two versions:\n\t- The version 0.0.1 (split name 'incomplete') covers articles from 01.01.2022 until 12.12.2023, kept for now as it's used in some other datasets\n\t- The version 0.0.2 (split name 'train') is the one you need and contains all articles from 01.01.2022 till 31.12.2023\n- File structure:\n\t- 'data/train' is the full 2y 0.0.2 dataset, the one you need\n\t- 'data/incomplete' is the old 0.0.1 version\n\t- 'metadata/' contains the tags mappings and list of downloaded URIs for both versions### The rest\n- <URL is the draft of the relevant thesis section\n- pchr8/up_crawler is the crawler I wrote to gather this dataset\n<br><br>\n\nFor any questions, my first name is Serhii, and my email is my_first_name@my_first_name.net."
] |
e2853b58b9190af121ece09013718ff601a1c205 |
# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abhishek/hepu-o4zf-ravz-7-0](https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-13T15:35:44.454119](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0/blob/main/results_2023-12-13T15-35-44.454119.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23314538989180242,
"acc_stderr": 0.029997889374526535,
"acc_norm": 0.23329981822026483,
"acc_norm_stderr": 0.03078927812586186,
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156487,
"mc2": 0.5166805857294422,
"mc2_stderr": 0.016293087426390157
},
"harness|arc:challenge|25": {
"acc": 0.20477815699658702,
"acc_stderr": 0.011792544338513412,
"acc_norm": 0.24488054607508533,
"acc_norm_stderr": 0.012566273985131358
},
"harness|hellaswag|10": {
"acc": 0.2584146584345748,
"acc_stderr": 0.004368684255626194,
"acc_norm": 0.25363473411670984,
"acc_norm_stderr": 0.004342017709967956
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1925925925925926,
"acc_stderr": 0.03406542058502653,
"acc_norm": 0.1925925925925926,
"acc_norm_stderr": 0.03406542058502653
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.028504856470514203,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.028504856470514203
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.21693121693121692,
"acc_stderr": 0.021227082449445073,
"acc_norm": 0.21693121693121692,
"acc_norm_stderr": 0.021227082449445073
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.17096774193548386,
"acc_stderr": 0.02141724293632157,
"acc_norm": 0.17096774193548386,
"acc_norm_stderr": 0.02141724293632157
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1477832512315271,
"acc_stderr": 0.024969621333521274,
"acc_norm": 0.1477832512315271,
"acc_norm_stderr": 0.024969621333521274
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180361,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180361
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2,
"acc_stderr": 0.020280805062535722,
"acc_norm": 0.2,
"acc_norm_stderr": 0.020280805062535722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.18543046357615894,
"acc_stderr": 0.03173284384294285,
"acc_norm": 0.18543046357615894,
"acc_norm_stderr": 0.03173284384294285
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2,
"acc_stderr": 0.017149858514250937,
"acc_norm": 0.2,
"acc_norm_stderr": 0.017149858514250937
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.14814814814814814,
"acc_stderr": 0.02422762927372836,
"acc_norm": 0.14814814814814814,
"acc_norm_stderr": 0.02422762927372836
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.029872577708891148,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.029872577708891148
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.23243933588761176,
"acc_stderr": 0.015104550008905702,
"acc_norm": 0.23243933588761176,
"acc_norm_stderr": 0.015104550008905702
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.02405102973991225,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.02405102973991225
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0227797190887334,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0227797190887334
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2426470588235294,
"acc_stderr": 0.026040662474201247,
"acc_norm": 0.2426470588235294,
"acc_norm_stderr": 0.026040662474201247
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25458996328029376,
"mc1_stderr": 0.015250117079156487,
"mc2": 0.5166805857294422,
"mc2_stderr": 0.016293087426390157
},
"harness|winogrande|5": {
"acc": 0.4925019731649566,
"acc_stderr": 0.014050905521228573
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0 | [
"region:us"
] | 2023-12-13T15:38:36+00:00 | {"pretty_name": "Evaluation run of abhishek/hepu-o4zf-ravz-7-0", "dataset_summary": "Dataset automatically created during the evaluation run of model [abhishek/hepu-o4zf-ravz-7-0](https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T15:35:44.454119](https://huggingface.co/datasets/open-llm-leaderboard/details_abhishek__hepu-o4zf-ravz-7-0/blob/main/results_2023-12-13T15-35-44.454119.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23314538989180242,\n \"acc_stderr\": 0.029997889374526535,\n \"acc_norm\": 0.23329981822026483,\n \"acc_norm_stderr\": 0.03078927812586186,\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156487,\n \"mc2\": 0.5166805857294422,\n \"mc2_stderr\": 0.016293087426390157\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.20477815699658702,\n \"acc_stderr\": 0.011792544338513412,\n \"acc_norm\": 0.24488054607508533,\n \"acc_norm_stderr\": 0.012566273985131358\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2584146584345748,\n \"acc_stderr\": 0.004368684255626194,\n \"acc_norm\": 0.25363473411670984,\n \"acc_norm_stderr\": 0.004342017709967956\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1925925925925926,\n \"acc_stderr\": 0.03406542058502653,\n \"acc_norm\": 0.1925925925925926,\n \"acc_norm_stderr\": 0.03406542058502653\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n 
\"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.028504856470514203,\n \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.028504856470514203\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.21693121693121692,\n \"acc_stderr\": 0.021227082449445073,\n \"acc_norm\": 0.21693121693121692,\n \"acc_norm_stderr\": 0.021227082449445073\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.17096774193548386,\n \"acc_stderr\": 0.02141724293632157,\n \"acc_norm\": 0.17096774193548386,\n \"acc_norm_stderr\": 0.02141724293632157\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.1477832512315271,\n \"acc_stderr\": 0.024969621333521274,\n \"acc_norm\": 0.1477832512315271,\n \"acc_norm_stderr\": 0.024969621333521274\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180361,\n \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180361\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.020280805062535722,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.020280805062535722\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.18543046357615894,\n \"acc_stderr\": 0.03173284384294285,\n \"acc_norm\": 0.18543046357615894,\n \"acc_norm_stderr\": 0.03173284384294285\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.2,\n \"acc_stderr\": 0.017149858514250937,\n \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.017149858514250937\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.14814814814814814,\n \"acc_stderr\": 0.02422762927372836,\n \"acc_norm\": 0.14814814814814814,\n \"acc_norm_stderr\": 0.02422762927372836\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.029872577708891148,\n \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.029872577708891148\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.23243933588761176,\n \"acc_stderr\": 0.015104550008905702,\n 
\"acc_norm\": 0.23243933588761176,\n \"acc_norm_stderr\": 0.015104550008905702\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.02405102973991225,\n \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.02405102973991225\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.21296296296296297,\n \"acc_stderr\": 0.0227797190887334,\n \"acc_norm\": 0.21296296296296297,\n \"acc_norm_stderr\": 0.0227797190887334\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.2426470588235294,\n \"acc_stderr\": 0.026040662474201247,\n \"acc_norm\": 0.2426470588235294,\n \"acc_norm_stderr\": 0.026040662474201247\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25458996328029376,\n \"mc1_stderr\": 0.015250117079156487,\n \"mc2\": 0.5166805857294422,\n \"mc2_stderr\": 0.016293087426390157\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.4925019731649566,\n \"acc_stderr\": 0.014050905521228573\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```", "repo_url": "https://huggingface.co/abhishek/hepu-o4zf-ravz-7-0", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|arc:challenge|25_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|gsm8k|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hellaswag|10_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-35-44.454119.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["**/details_harness|winogrande|5_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T15-35-44.454119.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T15_35_44.454119", "path": ["results_2023-12-13T15-35-44.454119.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T15-35-44.454119.parquet"]}]}]} | 2023-12-13T15:39:23+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0
Dataset automatically created during the evaluation run of model abhishek/hepu-o4zf-ravz-7-0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
## Latest results
These are the latest results from run 2023-12-13T15:35:44.454119 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0\n\n\n\nDataset automatically created during the evaluation run of model abhishek/hepu-o4zf-ravz-7-0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T15:35:44.454119(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0\n\n\n\nDataset automatically created during the evaluation run of model abhishek/hepu-o4zf-ravz-7-0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T15:35:44.454119(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of abhishek/hepu-o4zf-ravz-7-0\n\n\n\nDataset automatically created during the evaluation run of model abhishek/hepu-o4zf-ravz-7-0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T15:35:44.454119(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
d15b7bf061eca4b59d588ab6303706caac3b6bc6 |
# Dataset Card for Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [simonveitner/Math-OpenHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/Math-OpenHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
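Since each run is stored as a timestamped split, you can also pin the load to one specific run instead of the "train"/"latest" split. The snippet below is a small sketch: the split name is assumed to be derived from the run timestamp of this evaluation ("YYYY_MM_DDTHH_MM_SS.ffffff" pattern) and should be checked against the splits actually listed in the repo.
```python
from datasets import load_dataset

# Pin the load to one specific run via its timestamped split
# (assumed name derived from the run timestamp of this evaluation).
data = load_dataset(
    "open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B",
    "harness_winogrande_5",
    split="2023_12_13T15_42_58.616928",
)
print(len(data))  # number of evaluated examples recorded for that run
```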
## Latest results
These are the [latest results from run 2023-12-13T15:42:58.616928](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B/blob/main/results_2023-12-13T15-42-58.616928.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6335216907991242,
"acc_stderr": 0.03232537565689536,
"acc_norm": 0.6354467611227453,
"acc_norm_stderr": 0.0329714707471459,
"mc1": 0.3488372093023256,
"mc1_stderr": 0.01668441985998689,
"mc2": 0.5090845417111513,
"mc2_stderr": 0.015345799600128406
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472437,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491887
},
"harness|hellaswag|10": {
"acc": 0.6413065126468831,
"acc_stderr": 0.004786368011500458,
"acc_norm": 0.8307110137422824,
"acc_norm_stderr": 0.0037424055874098784
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474887,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268545,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268545
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218957,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218957
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.0249393139069408,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.0249393139069408
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601446,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699796,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699796
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834827,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729474,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729474
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669971,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669971
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.01913994374848704,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.01913994374848704
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.028666857790274648,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.028666857790274648
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3488372093023256,
"mc1_stderr": 0.01668441985998689,
"mc2": 0.5090845417111513,
"mc2_stderr": 0.015345799600128406
},
"harness|winogrande|5": {
"acc": 0.7719021310181531,
"acc_stderr": 0.011793015817663592
},
"harness|gsm8k|5": {
"acc": 0.6110689916603488,
"acc_stderr": 0.01342838248127423
}
}
```
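The aggregated numbers above can also be read programmatically rather than copied from the JSON, by loading the "results" configuration. This is a minimal sketch assuming that configuration follows the layout described earlier in this card:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run.
results = load_dataset(
    "open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B",
    "results",
    split="latest",
)

df = results.to_pandas()  # one row per aggregated result record
print(df.columns)         # inspect which aggregated fields are available
```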
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B | [
"region:us"
] | 2023-12-13T15:45:52+00:00 | {"pretty_name": "Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B", "dataset_summary": "Dataset automatically created during the evaluation run of model [simonveitner/Math-OpenHermes-2.5-Mistral-7B](https://huggingface.co/simonveitner/Math-OpenHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T15:42:58.616928](https://huggingface.co/datasets/open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B/blob/main/results_2023-12-13T15-42-58.616928.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6335216907991242,\n \"acc_stderr\": 0.03232537565689536,\n \"acc_norm\": 0.6354467611227453,\n \"acc_norm_stderr\": 0.0329714707471459,\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.01668441985998689,\n \"mc2\": 0.5090845417111513,\n \"mc2_stderr\": 0.015345799600128406\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472437,\n \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491887\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6413065126468831,\n \"acc_stderr\": 0.004786368011500458,\n \"acc_norm\": 0.8307110137422824,\n \"acc_norm_stderr\": 0.0037424055874098784\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\": 
{\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474887,\n \"acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474887\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268545,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268545\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218957,\n \"acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218957\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.0249393139069408,\n \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.0249393139069408\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064077,\n \"acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064077\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601446,\n \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601446\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.030636591348699796,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.030636591348699796\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n 
\"acc_stderr\": 0.013664230995834827,\n \"acc_norm\": 0.822477650063857,\n \"acc_norm_stderr\": 0.013664230995834827\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729474,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729474\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n \"acc_stderr\": 0.012728446067669971,\n \"acc_norm\": 0.4595827900912647,\n \"acc_norm_stderr\": 0.012728446067669971\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.01913994374848704,\n \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.01913994374848704\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.028666857790274648,\n \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.028666857790274648\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3488372093023256,\n \"mc1_stderr\": 0.01668441985998689,\n \"mc2\": 0.5090845417111513,\n \"mc2_stderr\": 0.015345799600128406\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.7719021310181531,\n \"acc_stderr\": 0.011793015817663592\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6110689916603488,\n \"acc_stderr\": 0.01342838248127423\n }\n}\n```", "repo_url": 
"https://huggingface.co/simonveitner/Math-OpenHermes-2.5-Mistral-7B", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|arc:challenge|25_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|gsm8k|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hellaswag|10_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T15_42_58.616928", "path": ["**/details_harness|winogrande|5_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T15-42-58.616928.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T15_42_58.616928", "path": ["results_2023-12-13T15-42-58.616928.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T15-42-58.616928.parquet"]}]}]} | 2023-12-13T15:46:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B
Dataset automatically created during the evaluation run of model simonveitner/Math-OpenHermes-2.5-Mistral-7B on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
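```python
from datasets import load_dataset

data = load_dataset("open-llm-leaderboard/details_simonveitner__Math-OpenHermes-2.5-Mistral-7B",
	"harness_winogrande_5",
	split="train")
```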
## Latest results
These are the latest results from run 2023-12-13T15:42:58.616928 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model simonveitner/Math-OpenHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T15:42:58.616928(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model simonveitner/Math-OpenHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T15:42:58.616928(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of simonveitner/Math-OpenHermes-2.5-Mistral-7B\n\n\n\nDataset automatically created during the evaluation run of model simonveitner/Math-OpenHermes-2.5-Mistral-7B on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T15:42:58.616928(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
59e8981be79ecbeaf5307281a6ab6d6ab3f19752 |
Rich image captioning dataset used for training the PixLore model: https://arxiv.org/abs/2312.05349

"image_path" contains the path to the COCO dataset image (adjust the path to point at your local COCO copy),
"rich_caption" contains the rich caption created using the technique described in the paper.
The rest of the columns are used for debugging or improving the prompt.
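A minimal loading sketch (the column names come from this card; the "train" split name and the use of the `datasets` library are assumptions, so adjust them if the repository is organised differently):

```python
from datasets import load_dataset

# Assumed default split name; check the repository if it differs.
ds = load_dataset("Boni98/PixLore-Rich-Captions", split="train")

sample = ds[0]
coco_image_path = sample["image_path"]   # path to the COCO image; rewrite it to point at your local COCO copy
rich_caption = sample["rich_caption"]    # rich caption built with the technique described in the paper
print(coco_image_path)
print(rich_caption)
```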
| Boni98/PixLore-Rich-Captions | [
"task_categories:image-to-text",
"language:en",
"license:apache-2.0",
"arxiv:2312.05349",
"region:us"
] | 2023-12-13T15:52:51+00:00 | {"language": ["en"], "license": "apache-2.0", "task_categories": ["image-to-text"], "pretty_name": "PixLore Rich Captions"} | 2023-12-13T15:58:26+00:00 | [
"2312.05349"
] | [
"en"
] | TAGS
#task_categories-image-to-text #language-English #license-apache-2.0 #arxiv-2312.05349 #region-us
|
Rich image captioning dataset used for training PixLore model: URL
"image_path" contains the path to the COCO dataset image (change the path accordingly),
"rich_caption" contains the rich caption created using the technique described in the paper.
The rest of the columns are used for debugging or improving the prompt.
| [] | [
"TAGS\n#task_categories-image-to-text #language-English #license-apache-2.0 #arxiv-2312.05349 #region-us \n"
] | [
39
] | [
"passage: TAGS\n#task_categories-image-to-text #language-English #license-apache-2.0 #arxiv-2312.05349 #region-us \n"
] |
b4a5a6ed0f9834b2cf2a38fb52c0dfdf4289aadd |
# Ultrafeedback Curated
This dataset is a curated version of [UltraFeedback](https://huggingface.co/datasets/openbmb/UltraFeedback) dataset performed by Argilla (using [distilabel](https://github.com/argilla-io/distilabel)).
## Introduction
You can take a look at [argilla/ultrafeedback-binarized-preferences](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences) for more context on the UltraFeedback error, but the following excerpt sums up the problem found:
*After visually browsing around some examples using the sort and filter feature of Argilla (sort by highest rating for chosen responses), we noticed a strong mismatch between the `overall_score` in the original UF dataset (and the Zephyr train_prefs dataset) and the quality of the chosen response.*
*By adding the critique rationale to our Argilla Dataset, we confirmed the critique rationale was highly negative, whereas the rating was very high (the highest in fact: `10`). See screenshot below for one example of this issue. After some quick investigation, we identified hundreds of examples having the same issue and a potential bug on the UltraFeedback repo.*

## Differences with `openbmb/UltraFeedback`
This version of the dataset has replaced the `overall_score` of the responses identified as "wrong", and adds a new column `updated` to keep track of the updates.
The `updated` column contains a dict with the following content `{"completion_idx": "the index of the modified completion in the completion list", "distilabel_rationale": "the distilabel rationale"}`, and `None` if nothing was modified.
Other than that, the dataset can be used just like the original.
## Dataset processing
1. Starting from `argilla/ultrafeedback-binarized-curation` we selected all the records with `score_best_overall` equal to 10, as those were the problematic ones.
2. We created a new dataset using the `instruction` and the response from the model with the `best_overall_score_response` to be used with [distilabel](https://github.com/argilla-io/distilabel).
3. Using `gpt-4` and a task for `instruction_following` we obtained a new *rating* and *rationale* from the model for the 2405 "questionable" responses.
```python
import os
from distilabel.llm import OpenAILLM
from distilabel.pipeline import Pipeline
from distilabel.tasks import UltraFeedbackTask
from datasets import load_dataset
# Create the distilabel Pipeline
pipe = Pipeline(
labeller=OpenAILLM(
model="gpt-4",
task=UltraFeedbackTask.for_instruction_following(),
max_new_tokens=256,
num_threads=8,
openai_api_key=os.getenv("OPENAI_API_KEY") or "sk-...",
temperature=0.3,
),
)
# Download the original dataset:
ds = load_dataset("argilla/ultrafeedback-binarized-curation", split="train")
# Prepare the dataset in the format required by distilabel, will need the columns "input" and "generations"
def set_columns_for_distilabel(example):
input = example["instruction"]
generations = example["best_overall_score_response"]["response"]
return {"input": input, "generations": [generations]}
# Filter and prepare the dataset
ds_to_label = ds.filter(lambda ex: ex["score_best_overall"] == 10).map(set_columns_for_distilabel).select_columns(["input", "generations"])
# Label the dataset
ds_labelled = pipe.generate(ds_to_label, num_generations=1, batch_size=8)
```
4. After visual inspection, we decided to remove those answers that were rated as a 1, plus some extra ones rated as 2 and 3, as those were also not a real 10.
The final dataset has a total of 1968 records updated from a 10 to a 1 in the `overall_score` field of the corresponding model (around 3% of the dataset), and a new column "updated" with the rationale of `gpt-4` for the new rating, as well as the index in which the model can be found in the "models" and "completions" columns.
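A minimal sketch of how the `updated` column can be used to isolate the corrected records (assuming the `datasets` library and the "train" split declared in the dataset config):

```python
from datasets import load_dataset

ds = load_dataset("argilla/ultrafeedback-curated", split="train")

# Records whose overall_score was corrected carry a dict in "updated"; untouched records carry None
corrected = ds.filter(lambda ex: ex["updated"] is not None)
print(len(corrected))  # ~1968 records

example = corrected[0]
idx = example["updated"]["completion_idx"]
print(example["completions"][idx]["model"])          # which model's response was re-rated
print(example["completions"][idx]["overall_score"])  # now 1 instead of the original 10
print(example["updated"]["distilabel_rationale"])    # gpt-4 rationale behind the new rating
```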
## Reproduce
<a target="_blank" href="https://colab.research.google.com/drive/10R6uxb-Sviv64SyJG2wuWf9cSn6Z1yow?usp=sharing">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
To reproduce the data processing, feel free to run the attached Colab Notebook or just view it at [notebook](./ultrafeedback_curation_distilabel.ipynb) within this repository.
From Argilla we encourage anyone out there to play around, investigate, and experiment with the data, and we firmly believe in open sourcing what we do, as we ourselves, as well as the whole community, benefit a lot from open source and we also want to give back.
| argilla/ultrafeedback-curated | [
"task_categories:text-generation",
"size_categories:10K<n<100K",
"language:en",
"license:mit",
"region:us"
] | 2023-12-13T15:56:12+00:00 | {"language": ["en"], "license": "mit", "size_categories": ["10K<n<100K"], "task_categories": ["text-generation"], "pretty_name": "UltraFeedback Curated", "dataset_info": {"features": [{"name": "source", "dtype": "string"}, {"name": "instruction", "dtype": "string"}, {"name": "models", "sequence": "string"}, {"name": "completions", "list": [{"name": "annotations", "struct": [{"name": "helpfulness", "struct": [{"name": "Rating", "dtype": "string"}, {"name": "Rationale", "dtype": "string"}, {"name": "Rationale For Rating", "dtype": "string"}, {"name": "Type", "sequence": "string"}]}, {"name": "honesty", "struct": [{"name": "Rating", "dtype": "string"}, {"name": "Rationale", "dtype": "string"}]}, {"name": "instruction_following", "struct": [{"name": "Rating", "dtype": "string"}, {"name": "Rationale", "dtype": "string"}]}, {"name": "truthfulness", "struct": [{"name": "Rating", "dtype": "string"}, {"name": "Rationale", "dtype": "string"}, {"name": "Rationale For Rating", "dtype": "string"}, {"name": "Type", "sequence": "string"}]}]}, {"name": "critique", "dtype": "string"}, {"name": "custom_system_prompt", "dtype": "string"}, {"name": "model", "dtype": "string"}, {"name": "overall_score", "dtype": "float64"}, {"name": "principle", "dtype": "string"}, {"name": "response", "dtype": "string"}]}, {"name": "correct_answers", "sequence": "string"}, {"name": "incorrect_answers", "sequence": "string"}, {"name": "updated", "struct": [{"name": "completion_idx", "dtype": "int64"}, {"name": "distilabel_rationale", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 843221341, "num_examples": 63967}], "download_size": 321698501, "dataset_size": 843221341}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T17:55:55+00:00 | [] | [
"en"
] | TAGS
#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #region-us
|
# Ultrafeedback Curated
This dataset is a curated version of UltraFeedback dataset performed by Argilla (using distilabel).
## Introduction
You can take a look at argilla/ultrafeedback-binarized-preferences for more context on the UltraFeedback error, but the following excerpt sums up the problem found:
*After visually browsing around some examples using the sort and filter feature of Argilla (sort by highest rating for chosen responses), we noticed a strong mismatch between the 'overall_score' in the original UF dataset (and the Zephyr train_prefs dataset) and the quality of the chosen response.*
*By adding the critique rationale to our Argilla Dataset, we confirmed the critique rationale was highly negative, whereas the rating was very high (the highest in fact: '10'). See screenshot below for one example of this issue. After some quick investigation, we identified hundreds of examples having the same issue and a potential bug on the UltraFeedback repo.*
!image/png
## Differences with 'openbmb/UltraFeedback'
This version of the dataset has replaced the 'overall_score' of the responses identified as "wrong", and a new column 'updated' to keep track of the updates.
It contains a dict with the following content '{"completion_idx": "the index of the modified completion in the completion list", "distilabel_rationale": "the distilabel rationale"}', and 'None' if nothing was modified.
Other than that, the dataset can be used just like the original.
## Dataset processing
1. Starting from 'argilla/ultrafeedback-binarized-curation' we selected all the records with 'score_best_overall' equal to 10, as those were the problematic ones.
2. We created a new dataset using the 'instruction' and the response from the model with the 'best_overall_score_response' to be used with distilabel.
3. Using 'gpt-4' and a task for 'instruction_following', we obtained a new *rating* and *rationale* for the model's 2405 "questionable" responses (a minimal illustrative sketch of this step is given after this list).
4. After visual inspection, we decided to remove those answers that were rated as a 1, plus some extra ones rated as 2 and 3, as those were also not a real 10.
The final dataset has a total of 1968 records updated from a 10 to a 1 in the 'overall_score' field of the corresponding model (around 3% of the dataset), and a new column "updated" with the rationale of 'gpt-4' for the new rating, as well as the index in which the model can be found in the "models" and "completions" columns.
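The re-rating in step 3 was done with distilabel; the snippet below is only a hedged illustration of that kind of call using the plain OpenAI client (the prompt wording and function name are made up for the example, and this is not the distilabel code the authors ran):

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def rate_instruction_following(instruction: str, response: str) -> str:
    """Ask gpt-4 for a 1-10 instruction-following rating plus a short rationale."""
    prompt = (
        "Rate how well the response follows the instruction on a scale of 1-10 "
        "and briefly justify the rating.\n\n"
        f"Instruction:\n{instruction}\n\nResponse:\n{response}"
    )
    completion = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return completion.choices[0].message.content
```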
## Reproduce
<a target="_blank" href="URL
<img src="URL alt="Open In Colab"/>
</a>
To reproduce the data processing, feel free to run the attached Colab Notebook or just view the notebook within this repository.
From Argilla we encourage anyone out there to play around, investigate, and experiment with the data; we firmly believe in open sourcing what we do, as we ourselves, as well as the whole community, benefit a lot from open source, and we also want to give back. | [
| [
"# Ultrafeedback Curated\n\nThis dataset is a curated version of UltraFeedback dataset performed by Argilla (using distilabel).",
"## Introduction\n\nYou can take a look at argilla/ultrafeedback-binarized-preferences for more context on the UltraFeedback error, but the following excerpt sums up the problem found:\n\n*After visually browsing around some examples using the sort and filter feature of Argilla (sort by highest rating for chosen responses), we noticed a strong mismatch between the 'overall_score' in the original UF dataset (and the Zephyr train_prefs dataset) and the quality of the chosen response.*\n\n*By adding the critique rationale to our Argilla Dataset, we confirmed the critique rationale was highly negative, whereas the rating was very high (the highest in fact: '10'). See screenshot below for one example of this issue. After some quick investigation, we identified hundreds of examples having the same issue and a potential bug on the UltraFeedback repo.*\n\n!image/png",
"## Differences with 'openbmb/UltraFeedback'\n\nThis version of the dataset has replaced the 'overall_score' of the responses identified as \"wrong\", and a new column 'updated' to keep track of the updates.\nIt contains a dict with the following content '{\"completion_idx\": \"the index of the modified completion in the completion list\", \"distilabel_rationale\": \"the distilabel rationale\"}', and 'None' if nothing was modified.\nOther than that, the dataset can be used just like the original.",
"## Dataset processing\n\n1. Starting from 'argilla/ultrafeedback-binarized-curation' we selected all the records with 'score_best_overall' equal to 10, as those were the problematic ones.\n2. We created a new dataset using the 'instruction' and the response from the model with the 'best_overall_score_response' to be used with distilabel.\n3. Using 'gpt-4' and a task for 'instruction_following' we obtained a new *rating* and *rationale* of the model for the 2405 \"questionable\" responses.\n\n\n\n4. After visual inspection, we decided to remove those answers that were rated as a 1, plus some extra ones rated as 2 and 3, as those were also not a real 10.\n\nThe final dataset has a total of 1968 records updated from a 10 to a 1 in the 'overall_score' field of the corresponding model (around 3% of the dataset), and a new column \"updated\" with the rationale of 'gpt-4' for the new rating, as well as the index in which the model can be found in the \"models\" and \"completions\" columns.",
"## Reproduce\n\n<a target=\"_blank\" href=\"URL\n <img src=\"URL alt=\"Open In Colab\"/>\n</a>\n\nTo reproduce the data processing, feel free to run the attached Colab Notebook or just view it at notebook within this repository.\n\nFrom Argilla we encourage anyone out there to play around, investigate, and experiment with the data, and we firmly believe on open sourcing what we do, as ourselves, as well as the whole community, benefit a lot from open source and we also want to give back."
] | [
"TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #region-us \n",
"# Ultrafeedback Curated\n\nThis dataset is a curated version of UltraFeedback dataset performed by Argilla (using distilabel).",
"## Introduction\n\nYou can take a look at argilla/ultrafeedback-binarized-preferences for more context on the UltraFeedback error, but the following excerpt sums up the problem found:\n\n*After visually browsing around some examples using the sort and filter feature of Argilla (sort by highest rating for chosen responses), we noticed a strong mismatch between the 'overall_score' in the original UF dataset (and the Zephyr train_prefs dataset) and the quality of the chosen response.*\n\n*By adding the critique rationale to our Argilla Dataset, we confirmed the critique rationale was highly negative, whereas the rating was very high (the highest in fact: '10'). See screenshot below for one example of this issue. After some quick investigation, we identified hundreds of examples having the same issue and a potential bug on the UltraFeedback repo.*\n\n!image/png",
"## Differences with 'openbmb/UltraFeedback'\n\nThis version of the dataset has replaced the 'overall_score' of the responses identified as \"wrong\", and a new column 'updated' to keep track of the updates.\nIt contains a dict with the following content '{\"completion_idx\": \"the index of the modified completion in the completion list\", \"distilabel_rationale\": \"the distilabel rationale\"}', and 'None' if nothing was modified.\nOther than that, the dataset can be used just like the original.",
"## Dataset processing\n\n1. Starting from 'argilla/ultrafeedback-binarized-curation' we selected all the records with 'score_best_overall' equal to 10, as those were the problematic ones.\n2. We created a new dataset using the 'instruction' and the response from the model with the 'best_overall_score_response' to be used with distilabel.\n3. Using 'gpt-4' and a task for 'instruction_following' we obtained a new *rating* and *rationale* of the model for the 2405 \"questionable\" responses.\n\n\n\n4. After visual inspection, we decided to remove those answers that were rated as a 1, plus some extra ones rated as 2 and 3, as those were also not a real 10.\n\nThe final dataset has a total of 1968 records updated from a 10 to a 1 in the 'overall_score' field of the corresponding model (around 3% of the dataset), and a new column \"updated\" with the rationale of 'gpt-4' for the new rating, as well as the index in which the model can be found in the \"models\" and \"completions\" columns.",
"## Reproduce\n\n<a target=\"_blank\" href=\"URL\n <img src=\"URL alt=\"Open In Colab\"/>\n</a>\n\nTo reproduce the data processing, feel free to run the attached Colab Notebook or just view it at notebook within this repository.\n\nFrom Argilla we encourage anyone out there to play around, investigate, and experiment with the data, and we firmly believe on open sourcing what we do, as ourselves, as well as the whole community, benefit a lot from open source and we also want to give back."
] | [
38,
33,
208,
142,
271,
121
] | [
"passage: TAGS\n#task_categories-text-generation #size_categories-10K<n<100K #language-English #license-mit #region-us \n# Ultrafeedback Curated\n\nThis dataset is a curated version of UltraFeedback dataset performed by Argilla (using distilabel).## Introduction\n\nYou can take a look at argilla/ultrafeedback-binarized-preferences for more context on the UltraFeedback error, but the following excerpt sums up the problem found:\n\n*After visually browsing around some examples using the sort and filter feature of Argilla (sort by highest rating for chosen responses), we noticed a strong mismatch between the 'overall_score' in the original UF dataset (and the Zephyr train_prefs dataset) and the quality of the chosen response.*\n\n*By adding the critique rationale to our Argilla Dataset, we confirmed the critique rationale was highly negative, whereas the rating was very high (the highest in fact: '10'). See screenshot below for one example of this issue. After some quick investigation, we identified hundreds of examples having the same issue and a potential bug on the UltraFeedback repo.*\n\n!image/png## Differences with 'openbmb/UltraFeedback'\n\nThis version of the dataset has replaced the 'overall_score' of the responses identified as \"wrong\", and a new column 'updated' to keep track of the updates.\nIt contains a dict with the following content '{\"completion_idx\": \"the index of the modified completion in the completion list\", \"distilabel_rationale\": \"the distilabel rationale\"}', and 'None' if nothing was modified.\nOther than that, the dataset can be used just like the original."
] |
01021f1d74b3401dac81516f2506d4689f230077 |
This is a copy of DRAL-2.0 dataset (https://www.cs.utep.edu/nigel/dral/), with only short fragments and only those where the audio on the left is original.
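A minimal loading sketch, assuming the 'v2_short' split and the feature names listed in the dataset info below (the audio is stored as raw float32 sample sequences, so the sampling rate has to be known from the DRAL documentation itself):

```python
from datasets import load_dataset

ds = load_dataset("cointegrated/dral_lite", split="v2_short")

row = ds[0]
print(row["text_orig"])   # text associated with the original fragment
print(row["text_trans"])  # text associated with the re-enacted counterpart
wav = row["wav_orig"]     # raw float32 samples of the original audio
print(len(wav), "samples; duration field:", row["duration"])
```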
| cointegrated/dral_lite | [
"region:us"
] | 2023-12-13T15:57:18+00:00 | {"dataset_info": {"features": [{"name": "id", "dtype": "string"}, {"name": "conv_id", "dtype": "string"}, {"name": "original_or_reenacted", "dtype": "string"}, {"name": "time_start", "dtype": "string"}, {"name": "time_end", "dtype": "string"}, {"name": "duration", "dtype": "string"}, {"name": "trans_id", "dtype": "string"}, {"name": "wav_orig", "sequence": "float32"}, {"name": "wav_trans", "sequence": "float32"}, {"name": "text_orig", "dtype": "string"}, {"name": "text_trans", "dtype": "string"}], "splits": [{"name": "v2_short", "num_bytes": 347282073, "num_examples": 1173}], "download_size": 349744398, "dataset_size": 347282073}, "configs": [{"config_name": "default", "data_files": [{"split": "v2_short", "path": "data/v2_short-*"}]}]} | 2024-01-11T16:40:13+00:00 | [] | [] | TAGS
#region-us
|
This is a copy of DRAL-2.0 dataset (URL with only short fragments and only those where the audio on the left is original.
| [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
3ff7c52c50ee7ca1157bb2b09b4d701e564a160e |
# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0",
"harness_winogrande_5",
split="train")
```
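To discover the available configurations, or to pin the most recent run instead of "train", something along these lines should work (a sketch; the "latest" split and the 'harness_gsm8k_5' configuration are taken from the file listing in this card's metadata):

```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0"
configs = get_dataset_config_names(repo)
print(len(configs), configs[:5])

# Every configuration also exposes a "latest" split pointing at the newest run.
gsm8k_latest = load_dataset(repo, "harness_gsm8k_5", split="latest")
print(gsm8k_latest)
```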
## Latest results
These are the [latest results from run 2023-12-13T16:09:54.285787](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0/blob/main/results_2023-12-13T16-09-54.285787.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6550126565893573,
"acc_stderr": 0.031671818397293244,
"acc_norm": 0.6574305406535094,
"acc_norm_stderr": 0.03231451555422857,
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4503940613910581,
"mc2_stderr": 0.014223702771748893
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398324,
"acc_norm": 0.6194539249146758,
"acc_norm_stderr": 0.014188277712349812
},
"harness|hellaswag|10": {
"acc": 0.6542521410077674,
"acc_stderr": 0.0047463946133845395,
"acc_norm": 0.8460466042620992,
"acc_norm_stderr": 0.0036016648387189156
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.025467149045469536,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.025467149045469536
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695482995,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695482995
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8434343434343434,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.8434343434343434,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9222797927461139,
"acc_stderr": 0.019321805557223154,
"acc_norm": 0.9222797927461139,
"acc_norm_stderr": 0.019321805557223154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635474,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635474
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465715,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465715
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092434,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8270042194092827,
"acc_stderr": 0.02462156286676842,
"acc_norm": 0.8270042194092827,
"acc_norm_stderr": 0.02462156286676842
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7130044843049327,
"acc_stderr": 0.030360379710291947,
"acc_norm": 0.7130044843049327,
"acc_norm_stderr": 0.030360379710291947
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281382,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281382
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265023,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265023
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25027932960893856,
"acc_stderr": 0.01448750085285042,
"acc_norm": 0.25027932960893856,
"acc_norm_stderr": 0.01448750085285042
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7654320987654321,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.7654320987654321,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.49608865710560623,
"acc_stderr": 0.012769845366441192,
"acc_norm": 0.49608865710560623,
"acc_norm_stderr": 0.012769845366441192
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.026799562024887664,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.026799562024887664
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.018433427649401903,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.018433427649401903
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7714285714285715,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.7714285714285715,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8756218905472637,
"acc_stderr": 0.023335401790166327,
"acc_norm": 0.8756218905472637,
"acc_norm_stderr": 0.023335401790166327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.028782108105401705,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.028782108105401705
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3047735618115055,
"mc1_stderr": 0.016114124156882455,
"mc2": 0.4503940613910581,
"mc2_stderr": 0.014223702771748893
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273764
},
"harness|gsm8k|5": {
"acc": 0.5549658832448825,
"acc_stderr": 0.013689011567414202
}
}
```
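The aggregated numbers above can also be fetched directly as a JSON file from the repository, e.g. with huggingface_hub (a sketch; the filename comes from the link in the "Latest results" heading, while the file's internal layout is not shown in the card, so it is inspected rather than assumed):

```python
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0",
    filename="results_2023-12-13T16-09-54.285787.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

# Inspect the top-level keys before relying on any particular structure.
print(list(results.keys()))
```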
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0 | [
"region:us"
] | 2023-12-13T16:08:48+00:00 | {"pretty_name": "Evaluation run of upstage/SOLAR-10.7B-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [upstage/SOLAR-10.7B-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T16:09:54.285787](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0/blob/main/results_2023-12-13T16-09-54.285787.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6550126565893573,\n \"acc_stderr\": 0.031671818397293244,\n \"acc_norm\": 0.6574305406535094,\n \"acc_norm_stderr\": 0.03231451555422857,\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4503940613910581,\n \"mc2_stderr\": 0.014223702771748893\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398324,\n \"acc_norm\": 0.6194539249146758,\n \"acc_norm_stderr\": 0.014188277712349812\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6542521410077674,\n \"acc_stderr\": 0.0047463946133845395,\n \"acc_norm\": 0.8460466042620992,\n \"acc_norm_stderr\": 0.0036016648387189156\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n 
\"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n \"acc_stderr\": 0.025467149045469536,\n \"acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.025467149045469536\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695482995,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695482995\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8434343434343434,\n \"acc_stderr\": 0.025890520358141454,\n \"acc_norm\": 0.8434343434343434,\n \"acc_norm_stderr\": 0.025890520358141454\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9222797927461139,\n \"acc_stderr\": 0.019321805557223154,\n \"acc_norm\": 0.9222797927461139,\n \"acc_norm_stderr\": 0.019321805557223154\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635474,\n \"acc_norm\": 0.6564102564102564,\n 
\"acc_norm_stderr\": 0.024078696580635474\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465715,\n \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465715\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092434,\n \"acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092434\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.033247089118091176,\n \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.033247089118091176\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8270042194092827,\n \"acc_stderr\": 0.02462156286676842,\n \"acc_norm\": 0.8270042194092827,\n \"acc_norm_stderr\": 0.02462156286676842\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7130044843049327,\n \"acc_stderr\": 0.030360379710291947,\n \"acc_norm\": 0.7130044843049327,\n \"acc_norm_stderr\": 0.030360379710291947\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281382,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281382\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n \"acc_norm_stderr\": 0.013507943909371803\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265023,\n \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265023\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25027932960893856,\n \"acc_stderr\": 0.01448750085285042,\n \"acc_norm\": 0.25027932960893856,\n \"acc_norm_stderr\": 0.01448750085285042\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.02342037547829613,\n \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.02342037547829613\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7654320987654321,\n \"acc_stderr\": 0.023576881744005716,\n \"acc_norm\": 0.7654320987654321,\n \"acc_norm_stderr\": 0.023576881744005716\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.49608865710560623,\n \"acc_stderr\": 0.012769845366441192,\n \"acc_norm\": 0.49608865710560623,\n \"acc_norm_stderr\": 0.012769845366441192\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.026799562024887664,\n \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.026799562024887664\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.018433427649401903,\n \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.018433427649401903\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7714285714285715,\n \"acc_stderr\": 0.026882144922307744,\n \"acc_norm\": 0.7714285714285715,\n \"acc_norm_stderr\": 0.026882144922307744\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8756218905472637,\n \"acc_stderr\": 0.023335401790166327,\n \"acc_norm\": 0.8756218905472637,\n \"acc_norm_stderr\": 0.023335401790166327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.028782108105401705,\n \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.028782108105401705\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3047735618115055,\n \"mc1_stderr\": 0.016114124156882455,\n \"mc2\": 0.4503940613910581,\n \"mc2_stderr\": 0.014223702771748893\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273764\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5549658832448825,\n \"acc_stderr\": 0.013689011567414202\n }\n}\n```", "repo_url": "https://huggingface.co/upstage/SOLAR-10.7B-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", 
"point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|arc:challenge|25_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|arc:challenge|25_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|gsm8k|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|gsm8k|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hellaswag|10_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hellaswag|10_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T16-05-57.212237.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T16-05-57.212237.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T16-09-54.285787.parquet", 
"**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T16-09-54.285787.parquet", 
"**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T16-09-54.285787.parquet", 
"**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T16-09-54.285787.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": 
["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": 
["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": 
["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T16-05-57.212237.parquet"]}, 
{"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": 
"latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": 
["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": 
["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["**/details_harness|winogrande|5_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": ["**/details_harness|winogrande|5_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T16-09-54.285787.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T16_05_57.212237", "path": ["results_2023-12-13T16-05-57.212237.parquet"]}, {"split": "2023_12_13T16_09_54.285787", "path": 
["results_2023-12-13T16-09-54.285787.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T16-09-54.285787.parquet"]}]}]} | 2023-12-13T16:12:55+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-v1.0
Dataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
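A minimal sketch is shown below; the repository id follows the usual Open LLM Leaderboard naming convention for detail datasets ("open-llm-leaderboard/details_<org>__<model>") and is an assumption here, while the configuration names (e.g. "results", "harness_winogrande_5") come from the metadata above.

```python
from datasets import load_dataset

# Assumed repository id (standard Open LLM Leaderboard detail-dataset naming);
# the config names ("results", "harness_winogrande_5", ...) are listed in the metadata above.
data = load_dataset(
    "open-llm-leaderboard/details_upstage__SOLAR-10.7B-v1.0",
    "harness_winogrande_5",
    split="latest",
)
print(data[0])
```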
## Latest results
These are the latest results from run 2023-12-13T16:09:54.285787 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T16:09:54.285787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T16:09:54.285787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
187,
66,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-v1.0\n\n\n\nDataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T16:09:54.285787(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]## Dataset Card Contact"
] |
d1bc0aa6ff9dfd94ab0edfbfcb6c6c58ea5b4350 |
## Dataset Card for Anthropic_HH_Golden_Formatted
As per the original dataset: `This dataset is constructed to test the **ULMA** technique as mentioned in the paper
*Unified Language Model Alignment with Demonstration and Point-wise Human Preference*. They show that replacing the
positive samples in a preference dataset by high-quality demonstration data (golden data) greatly improves the
performance of various alignment methods (RLHF, DPO, ULMA). In particular, the ULMA method exploits the high-quality
demonstration data in the preference dataset by treating the positive and negative samples differently, and boosting
the performance by removing the KL regularizer for positive samples.`
For more information please see the original dataset at [Unified-Language-Model-Alignment/Anthropic_HH_Golden](https://huggingface.co/datasets/Unified-Language-Model-Alignment/Anthropic_HH_Golden).
### Formatting
Since the [Unified-Language-Model-Alignment/Anthropic_HH_Golden](https://huggingface.co/datasets/Unified-Language-Model-Alignment/Anthropic_HH_Golden) comes
in raw format, in order to ease the usage of this dataset, the following formatting has been applied:
* Separate `prompt` from `chosen` and `rejected` columns to have an overview of the prompts, as those are shared by both `chosen` and `rejected`
within the same rows.
* Add a `prompt_id` which is a SHA-256 encoding of the `prompt`
* Turn the raw conversations in `chosen` and `rejected` from `Human: ... Assistant: ... ...` to a chat-compliant format as a list of `{"role": "user" | "assistant", "content": "..."}`
Also note that using this format leads to a much better integration with [`huggingface/alignment-handbook`](https://github.com/huggingface/alignment-handbook), providing a
straightforward way to fine-tune 7B LLMs using DPO, thanks to the awesome work done by [HuggingFaceH4](https://huggingface.co/HuggingFaceH4).
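A minimal sketch of that conversion is shown below; it assumes the raw transcripts separate turns with blank lines (as in the raw Anthropic HH format) and that the shared `prompt` is taken to be the first human turn. Neither detail is specified above, so treat both as assumptions.

```python
import hashlib
import re

ROLE_MAP = {"Human": "user", "Assistant": "assistant"}

def to_messages(raw: str) -> list:
    """Split a raw 'Human: ... Assistant: ...' transcript into chat-format messages."""
    parts = re.split(r"\n\n(Human|Assistant): ", "\n\n" + raw.strip())
    # parts == ["", "Human", "<turn>", "Assistant", "<turn>", ...]
    return [
        {"role": ROLE_MAP[speaker], "content": content.strip()}
        for speaker, content in zip(parts[1::2], parts[2::2])
    ]

def format_example(chosen_raw: str, rejected_raw: str) -> dict:
    chosen, rejected = to_messages(chosen_raw), to_messages(rejected_raw)
    prompt = chosen[0]["content"]  # assumption: the shared prompt is the first user turn
    return {
        "prompt_id": hashlib.sha256(prompt.encode("utf-8")).hexdigest(),
        "prompt": prompt,
        "chosen": chosen,
        "rejected": rejected,
    }
```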
### Usage
Use it directly via 🤗`datasets`:
```python
from datasets import load_dataset
dataset = load_dataset("alvarobartt/Anthropic_HH_Golden_Formatted")
```
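Continuing from the snippet above, each row exposes the shared prompt plus the two chat-format conversations (field names as in the schema described above):

```python
example = dataset["train"][0]
print(example["prompt_id"], example["prompt"])
print(example["chosen"][-1]["role"], example["chosen"][-1]["content"])
```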
### Disclaimer
This dataset is only a copy of the original one, with a clearer and easier-to-use formatting; all
the credits go to the original authors at [Unified-Language-Model-Alignment](https://huggingface.co/Unified-Language-Model-Alignment). | alvarobartt/Anthropic_HH_Golden_Formatted | [
"task_categories:conversational",
"size_categories:10K<n<100K",
"language:en",
"license:apache-2.0",
"not-for-all-audiences",
"region:us"
] | 2023-12-13T16:32:59+00:00 | {"language": ["en"], "license": "apache-2.0", "size_categories": ["10K<n<100K"], "task_categories": ["conversational"], "pretty_name": "Anthropic HH Golden Formatted", "dataset_info": {"features": [{"name": "prompt_id", "dtype": "string"}, {"name": "prompt", "dtype": "string"}, {"name": "chosen", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}, {"name": "rejected", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 65325008, "num_examples": 42537}, {"name": "test", "num_bytes": 3651096, "num_examples": 2312}], "download_size": 39481598, "dataset_size": 68976104}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "tags": ["not-for-all-audiences"]} | 2023-12-14T10:01:59+00:00 | [] | [
"en"
] | TAGS
#task_categories-conversational #size_categories-10K<n<100K #language-English #license-apache-2.0 #not-for-all-audiences #region-us
|
## Dataset Card for Anthropic_HH_Golden_Formatted
As per the original dataset: 'This dataset is constructed to test the ULMA technique as mentioned in the paper
*Unified Language Model Alignment with Demonstration and Point-wise Human Preference*. They show that replacing the
positive samples in a preference dataset by high-quality demonstration data (golden data) greatly improves the
performance of various alignment methods (RLHF, DPO, ULMA). In particular, the ULMA method exploits the high-quality
demonstration data in the preference dataset by treating the positive and negative samples differently, and boosting
the performance by removing the KL regularizer for positive samples.'
For more information please see the original dataset at Unified-Language-Model-Alignment/Anthropic_HH_Golden.
### Formatting
Since the Unified-Language-Model-Alignment/Anthropic_HH_Golden comes
in raw format, in order to ease the usage of this dataset, the following formatting has been applied:
* Separate 'prompt' from 'chosen' and 'rejected' columns to have an overview of the prompts, as those are shared by both 'chosen' and 'rejected'
within the same rows.
* Add a 'prompt_id' which is a SHA-256 encoding of the 'prompt'
* Turn the raw conversations in 'chosen' and 'rejected' from 'Human: ... Assistant: ... ...' to a chat-compliant format as a list of '{"role": "user" | "assistant", "content": "..."}'
Also note that using this format leads to a way better integration with 'huggingface/alignment-handbook, providing an
straight forward way to fine-tune 7B LLMs using DPO, thanks to the awesome work done by HuggingFaceH4.
### Usage
Use it directly via 'datasets':
### Disclaimer
This dataset is only a copy of the original one, but with a clearer and easy to use formatting, but all
the credits go to the original authors at Unified-Language-Model-Alignment. | [
"## Dataset Card for Anthropic_HH_Golden_Formatted\n\nAs per the original dataset: 'This dataset is constructed to test the ULMA technique as mentioned in the paper\n*Unified Language Model Alignment with Demonstration and Point-wise Human Preference*. They show that replacing the\npositive samples in a preference dataset by high-quality demonstration data (golden data) greatly improves the\nperformance of various alignment methods (RLHF, DPO, ULMA). In particular, the ULMA method exploits the high-quality\ndemonstration data in the preference dataset by treating the positive and negative samples differently, and boosting\nthe performance by removing the KL regularizer for positive samples.'\n\nFor more information please see the original dataset at Unified-Language-Model-Alignment/Anthropic_HH_Golden.",
"### Formatting\n\nSince the Unified-Language-Model-Alignment/Anthropic_HH_Golden comes\nin raw format, in order to ease the usage of this dataset, the following formatting has been applied:\n\n* Separate 'prompt' from 'chosen' and 'rejected' columns to have an overview of the prompts, as those are shared by both 'chosen' and 'rejected'\nwithin the same rows.\n* Add a 'prompt_id' which is a SHA-256 encoding of the 'prompt'\n* Turn the raw conversations in 'chosen' and 'rejected' from 'Human: ... Assistant: ... ...' to a chat-compliant format as a list of '{\"role\": \"user\" | \"assistant\", \"content\": \"...\"}'\n\nAlso note that using this format leads to a way better integration with 'huggingface/alignment-handbook, providing an\nstraight forward way to fine-tune 7B LLMs using DPO, thanks to the awesome work done by HuggingFaceH4.",
"### Usage\n\nUse it directly via 'datasets':",
"### Disclaimer\n\nThis dataset is only a copy of the original one, but with a clearer and easy to use formatting, but all\nthe credits go to the original authors at Unified-Language-Model-Alignment."
] | [
"TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-English #license-apache-2.0 #not-for-all-audiences #region-us \n",
"## Dataset Card for Anthropic_HH_Golden_Formatted\n\nAs per the original dataset: 'This dataset is constructed to test the ULMA technique as mentioned in the paper\n*Unified Language Model Alignment with Demonstration and Point-wise Human Preference*. They show that replacing the\npositive samples in a preference dataset by high-quality demonstration data (golden data) greatly improves the\nperformance of various alignment methods (RLHF, DPO, ULMA). In particular, the ULMA method exploits the high-quality\ndemonstration data in the preference dataset by treating the positive and negative samples differently, and boosting\nthe performance by removing the KL regularizer for positive samples.'\n\nFor more information please see the original dataset at Unified-Language-Model-Alignment/Anthropic_HH_Golden.",
"### Formatting\n\nSince the Unified-Language-Model-Alignment/Anthropic_HH_Golden comes\nin raw format, in order to ease the usage of this dataset, the following formatting has been applied:\n\n* Separate 'prompt' from 'chosen' and 'rejected' columns to have an overview of the prompts, as those are shared by both 'chosen' and 'rejected'\nwithin the same rows.\n* Add a 'prompt_id' which is a SHA-256 encoding of the 'prompt'\n* Turn the raw conversations in 'chosen' and 'rejected' from 'Human: ... Assistant: ... ...' to a chat-compliant format as a list of '{\"role\": \"user\" | \"assistant\", \"content\": \"...\"}'\n\nAlso note that using this format leads to a way better integration with 'huggingface/alignment-handbook, providing an\nstraight forward way to fine-tune 7B LLMs using DPO, thanks to the awesome work done by HuggingFaceH4.",
"### Usage\n\nUse it directly via 'datasets':",
"### Disclaimer\n\nThis dataset is only a copy of the original one, but with a clearer and easy to use formatting, but all\nthe credits go to the original authors at Unified-Language-Model-Alignment."
] | [
49,
196,
252,
14,
53
] | [
"passage: TAGS\n#task_categories-conversational #size_categories-10K<n<100K #language-English #license-apache-2.0 #not-for-all-audiences #region-us \n## Dataset Card for Anthropic_HH_Golden_Formatted\n\nAs per the original dataset: 'This dataset is constructed to test the ULMA technique as mentioned in the paper\n*Unified Language Model Alignment with Demonstration and Point-wise Human Preference*. They show that replacing the\npositive samples in a preference dataset by high-quality demonstration data (golden data) greatly improves the\nperformance of various alignment methods (RLHF, DPO, ULMA). In particular, the ULMA method exploits the high-quality\ndemonstration data in the preference dataset by treating the positive and negative samples differently, and boosting\nthe performance by removing the KL regularizer for positive samples.'\n\nFor more information please see the original dataset at Unified-Language-Model-Alignment/Anthropic_HH_Golden.### Formatting\n\nSince the Unified-Language-Model-Alignment/Anthropic_HH_Golden comes\nin raw format, in order to ease the usage of this dataset, the following formatting has been applied:\n\n* Separate 'prompt' from 'chosen' and 'rejected' columns to have an overview of the prompts, as those are shared by both 'chosen' and 'rejected'\nwithin the same rows.\n* Add a 'prompt_id' which is a SHA-256 encoding of the 'prompt'\n* Turn the raw conversations in 'chosen' and 'rejected' from 'Human: ... Assistant: ... ...' to a chat-compliant format as a list of '{\"role\": \"user\" | \"assistant\", \"content\": \"...\"}'\n\nAlso note that using this format leads to a way better integration with 'huggingface/alignment-handbook, providing an\nstraight forward way to fine-tune 7B LLMs using DPO, thanks to the awesome work done by HuggingFaceH4."
] |
a4911eeddbeb794b58e9f8aaae6786ff8d4668fe | With my baby always I'm so happy this book is not enough to write or express how much happy today its been more than years but still that bond remain same
Today 11/11/2023 it's around 6:40 pm in the evening one particular day I cannot consider as my favorite day because with my baby each and every moment is very special & precious at the same time we enjoy a lot. Hanging out with my buddies it's common and special.
Me & dhee was reading the book name called "namma Bengaluru" and suddenly I planned to go V V puram (food street)
One of our favorite spot me, dhee my boyfriend & grace went V V puram
We have reached around 9:30pm
First we went dosa corner & we had dosa dhee's favorite avarebele dose & cheese masala dose then explored whole street & we had snacks like potato twisters,pizza, corn etc. Then while returning dhee purchased some Keychains & my bf purchased long woolen beanie cap we took nice photos & seriously that was a amazing day chilled weather, cool breeze, night 🌙 road, street full of lights with your most favorite people a long walk you just can't imagine how beautiful it was | DJ7/DJ7 | [
"region:us"
] | 2023-12-13T16:47:50+00:00 | {} | 2023-12-13T16:48:22+00:00 | [] | [] | TAGS
#region-us
| With my baby always I'm so happy this book is not enough to write or express how much happy today its been more than years but still that bond remain same
Today 11/11/2023 it's around 6:40 pm in the evening one particular day I cannot consider as my favorite day because with my baby each and every moment is very special & precious at the same time we enjoy a lot. Hanging out with my buddies it's common and special.
Me & dhee was reading the book name called "namma Bengaluru" and suddenly I planned to go V V puram (food street)
One of our favorite spot me, dhee my boyfriend & grace went V V puram
We have reached around 9:30pm
First we went dosa corner & we had dosa dhee's favorite avarebele dose & cheese masala dose then explored whole street & we had snacks like potato twisters,pizza, corn etc. Then while returning dhee purchased some Keychains & my bf purchased long woolen beanie cap we took nice photos & seriously that was a amazing day chilled weather, cool breeze, night road, street full of lights with your most favorite people a long walk you just can't imagine how beautiful it was | [] | [
"TAGS\n#region-us \n"
] | [
6
] | [
"passage: TAGS\n#region-us \n"
] |
d77276de580aed6ec6ca014f70353ed71d9fc102 | # Dataset Card for InstruCatPlus
## Dataset Description
### Dataset Summary
InstruCat is a dataset consisting of 235318 instructions in Catalan, plus dolly_en (English, 11k) and a custom Spanish instruction dataset (10k) to avoid catastrophic forgetting for the Flor multilingual model.
### Dataset contains data converted to instruction format from the following datasets:

- caBreu : The instructions were created in the form of summarization tasks. There are 2 types of summarization categories in the dataset: extreme and abstractive. The extreme one summarizes a text into one sentence and the abstractive one into shorter texts of around 3-5 sentences.
- CaSERa : The instructions were created in the form of multilabel classification tasks, where the labels are a given set of emotions present or absent in the texts.
- CatalanQA : The instructions correspond to questions in CatalanQA.
- CaWikiTC : The instructions were created in 2 different forms of text classification tasks with a 70% - 30% distribution (an illustrative sketch of this two-form conversion is shown after this list). The first form asks to define the category of a given text. The second form asks whether a given text belongs to a certain category, phrased as an alternative question.
- ceil : The instructions were created in 2 different forms of Named Entity Recognition tasks with a 70% - 30% distribution. The first form asks to list all the Named Entities found. The second form asks to list only the Named Entities of a particular category.
- CoqCat : The instructions correspond to the first questions of CoqCat conversations.
- GuiaCat : The instructions were created in the form of sentiment analysis tasks.
- IntoxiCat : The instructions were created in the form of binary classification tasks. The task is to determine whether a given text is toxic or not.
- NLUCat : The instructions were created in the form of phrase generation tasks to express a given intent.
- Parafraseja : The instructions were created in the form of text generation tasks. The task is to generate a text equivalent in meaning to a given text.
- PAWS-ca : The instructions were created in the form of text generation tasks. The task is to generate a text equivalent in meaning to a given text.
- sts-ca : The instructions were created in the form of text generation tasks. The task is to generate a text equivalent in meaning to a given text.
- teca : The instructions were created in 2 different ways with a 70% - 30% distribution. The first way is in the form of entailment generation tasks. The second way is to determine whether one given text is an entailment of another given text.
- WikiCat : The instructions were created in 2 different forms of text classification tasks with a 70% - 30% distribution. The first form asks to define the category of a given text. The second form asks whether a given text belongs to a certain category, phrased as an alternative question.
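As an illustration of the two-form conversion with the 70% - 30% distribution (used for CaWikiTC and WikiCat), a hypothetical sketch follows; the function name, the Catalan prompt wording and the random split are placeholders and do not reproduce the actual generation code.

```python
import random

random.seed(42)

def classification_to_instruction(text: str, category: str, all_categories: list) -> dict:
    """Hypothetical sketch of the 70% / 30% split between the two task forms."""
    if random.random() < 0.7:
        # Form 1: ask for the category of the given text.
        instruction = "Classifica el text següent en una de les categories donades."
        response = category
    else:
        # Form 2: alternative (yes/no) question about a candidate category.
        candidate = random.choice(all_categories)
        instruction = f"El text següent pertany a la categoria '{candidate}'?"
        response = "Sí" if candidate == category else "No"
    return {
        "instruction": instruction,
        "context": text,
        "response": response,
        "category": "text_classification",
    }
```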
## Dataset Structure
#### Data Splits
- train.jsonl: 178044 instructions
- validation.jsonl: 28125 instructions
- test.jsonl: 29149 instructions
### Data Instances
Three JSONL files, one for each split.
An example of 'test' looks as follows:
```
{
"ID": "Parafraseja_8977",
"instruction": "Reescriu aquesta frase sense alterar-ne el significat:",
"context": "Es tracta d'un tipus que ens falla ja que a ell li falla aquesta falta d'interès per tal d'exercir el domini sobre l'ambient.",
"response": "Es tracta d'un tipus que ens falla perquè a ell li falla aquesta falta d'interès per exercir el domini sobre l'ambient.",
"category": "paraphrasis"
}
```
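The splits can be loaded directly with `datasets` (this assumes the hub loader resolves the three JSONL files automatically; otherwise pass them explicitly via `data_files`):

```python
from datasets import load_dataset

# Loads the train / validation / test splits described above.
ds = load_dataset("projecte-aina/InstruCatPlus")
print(ds["test"][0]["instruction"])
print(ds["test"][0]["response"])
```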
### Category Distribution
| Category | Number of instructions | % |
|----------------|------------------------|-------|
| ner | 59410 | 25.24% |
| paraphrasis | 34695 | 14.74% |
| text_classification | 33393 | 14.19% |
| toxicity | 29809 | 12.66% |
| qa | 27427 | 11.65% |
| emotion_detection | 18492 | 7.85% |
| phrase_generation | 11873 | 5.04% |
| entailment_generation | 6354 | 2.70% |
| sentiment_analysis | 5750 | 2.44% |
| abstractive_summarization | 2999 | 1.27% |
| extreme_summarization | 2999 | 1.27% |
| entailment | 2117 | 0.89% | | projecte-aina/InstruCatPlus | [
"language:ca",
"region:us"
] | 2023-12-13T16:59:32+00:00 | {"language": ["ca"]} | 2023-12-21T15:47:43+00:00 | [] | [
"ca"
] | TAGS
#language-Catalan #region-us
| Dataset Card for InstrucatPLUS
==============================
Dataset Description
-------------------
### Dataset Summary
InstruCat is a dataset consisting of 235318 instructions in Catalan, plus dolly\_en (English) 11k and a custom Spanish instructional one (10k) to avoid catastrophic forgetfullness for the Flor multilingual model.
### Dataset contains data converted to instructions format from the following datasets:
* caBreu : The instructions were created in form of summarization tasks. There are 2 types of summarization categories in the dataset: extreme and abstractive. The extreme one summarizes text into one sentence and the abstractive into shorter texts around 3-5 sentences.
* CaSERa : The instructions were created in form of multilabel classification tasks, where the labels are a given set of emotions present or absent in the texts.
* CatalanQA : The instructions correspond to questions in CatalanQA.
* CaWikiTC : The instructions were created in 2 different ways of text classification tasks with the distribution 70% - 30%. The first way is to define a category of a given text. The second way is to answer where a given text belongs to a certain category in a form of alternative question.
* ceil : The instructions were created in 2 different ways of Named Entity Recognition tasks with the distribution 70% - 30%. The first way is to list all the found Named Entities. The second way is to list only Named Entities of a particular category.
* CoqCat : The instructions correspond to the first questions of CoqCat conversations.
* GuiaCat : The instructions were created in form of sentiment analysis tasks.
* IntoxiCat : The instructions were created in form of binary classification tasks. The task is to define wether a given text is toxic or no.
* NLUCat : The instructions were created in form of phrase generation tasks to express a given intent.
* Parafraseja : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.
* PAWS-ca : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.
* sts-ca : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.
* teca : The instructions were created in 2 different ways with the distribution 70% - 30%. The first way is in form of entailment generation tasks. The second way is to define whether one given text is an entailment of another given text.
* WikiCat : The instructions were created in 2 different ways of text classification tasks with the distribution 70% - 30%. The first way is to define a category of a given text. The second way is to answer whether a given text belongs to a certain category in the form of an alternative question.
Dataset Structure
-----------------
#### Data Splits
* URL: 178044 instructions
* URL: 28125 instructions
* URL: 29149 instructions
### Data Instances
Three JSONL files, one for each split.
An example of 'test' looks as follows:
### Category Distribution
Category: ner, Number of instructions: 59410, %: 25.24%
Category: paraphrasis, Number of instructions: 34695, %: 14.74%
Category: text\_classification, Number of instructions: 33393, %: 14.19%
Category: toxicity, Number of instructions: 29809, %: 12.66%
Category: qa, Number of instructions: 27427, %: 11.65%
Category: emotion\_detection, Number of instructions: 18492, %: 7.85%
Category: phrase\_generation, Number of instructions: 11873, %: 5.04%
Category: entailment\_generation, Number of instructions: 6354, %: 2.70%
Category: sentiment\_analysis, Number of instructions: 5750, %: 2.44%
Category: abstractive\_summarization, Number of instructions: 2999, %: 1.27%
Category: extreme\_summarization, Number of instructions: 2999, %: 1.27%
Category: entailment, Number of instructions: 2117, %: 0.89%
| [
"### Dataset Summary\n\n\nInstruCat is a dataset consisting of 235318 instructions in Catalan, plus dolly\\_en (English) 11k and a custom Spanish instructional one (10k) to avoid catastrophic forgetfullness for the Flor multilingual model.",
"### Dataset contains data converted to instructions format from the following datasets:\n\n\n* caBreu : The instructions were created in form of summarization tasks. There are 2 types of summarization categories in the dataset: extreme and abstractive. The extreme one summarizes text into one sentence and the abstractive into shorter texts around 3-5 sentences.\n* CaSERa : The instructions were created in form of multilabel classification tasks, where the labels are a given set of emotions present or absent in the texts.\n* CatalanQA : The instructions correspond to questions in CatalanQA.\n* CaWikiTC : The instructions were created in 2 different ways of text classification tasks with the distribution 70% - 30%. The first way is to define a category of a given text. The second way is to answer where a given text belongs to a certain category in a form of alternative question.\n* ceil : The instructions were created in 2 different ways of Named Entity Recognition tasks with the distribution 70% - 30%. The first way is to list all the found Named Entities. The second way is to list only Named Entities of a particular category.\n* CoqCat : The instructions correspond to the first questions of CoqCat conversations.\n* GuiaCat : The instructions were created in form of sentiment analysis tasks.\n* IntoxiCat : The instructions were created in form of binary classification tasks. The task is to define wether a given text is toxic or no.\n* NLUCat : The instructions were created in form of phrase generation tasks to express a given intent.\n* Parafraseja : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.\n* PAWS-ca : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.\n* sts-ca : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.\n* teca : The instructions were created in 2 different ways with the distribution 70% - 30%. The first way is in form of entailment generation tasks. The second way is to define whether one given text is an entailment of another given text.\n* WikiCat : The instructions were created in 2 different ways of text classification tasks with the distribution 70% - 30%. The first way is to define a category of a given text. The second way is to answer where a given text belongs to a certain category in a form of alternative question.\n\n\nDataset Structure\n-----------------",
"#### Data Splits\n\n\n* URL: 178044 instructions\n* URL: 28125 instructions\n* URL: 29149 instructions",
"### Data Instances\n\n\nThree JSONL files, one for each split.\n\n\nAn example of 'test' looks as follows:",
"### Category Distibution\n\n\nCategory: ner, Number of instructions: 59410, %: 25.24%\nCategory: paraphrasis, Number of instructions: 34695, %: 14.74%\nCategory: text\\_classification, Number of instructions: 33393, %: 14.19%\nCategory: toxicity, Number of instructions: 29809, %: 12.66%\nCategory: qa, Number of instructions: 27427, %: 11.65%\nCategory: emotion\\_detection, Number of instructions: 18492, %: 7.85%\nCategory: phrase\\_generation, Number of instructions: 11873, %: 5.04%\nCategory: entailment\\_generation, Number of instructions: 6354, %: 2.70%\nCategory: sentiment\\_analysis, Number of instructions: 5750, %: 2.44%\nCategory: abstractive\\_summarization, Number of instructions: 2999, %: 1.27%\nCategory: extreme\\_summarization, Number of instructions: 2999, %: 1.27%\nCategory: entailment, Number of instructions: 2117, %: 0.89%"
] | [
"TAGS\n#language-Catalan #region-us \n",
"### Dataset Summary\n\n\nInstruCat is a dataset consisting of 235318 instructions in Catalan, plus dolly\\_en (English) 11k and a custom Spanish instructional one (10k) to avoid catastrophic forgetfullness for the Flor multilingual model.",
"### Dataset contains data converted to instructions format from the following datasets:\n\n\n* caBreu : The instructions were created in form of summarization tasks. There are 2 types of summarization categories in the dataset: extreme and abstractive. The extreme one summarizes text into one sentence and the abstractive into shorter texts around 3-5 sentences.\n* CaSERa : The instructions were created in form of multilabel classification tasks, where the labels are a given set of emotions present or absent in the texts.\n* CatalanQA : The instructions correspond to questions in CatalanQA.\n* CaWikiTC : The instructions were created in 2 different ways of text classification tasks with the distribution 70% - 30%. The first way is to define a category of a given text. The second way is to answer where a given text belongs to a certain category in a form of alternative question.\n* ceil : The instructions were created in 2 different ways of Named Entity Recognition tasks with the distribution 70% - 30%. The first way is to list all the found Named Entities. The second way is to list only Named Entities of a particular category.\n* CoqCat : The instructions correspond to the first questions of CoqCat conversations.\n* GuiaCat : The instructions were created in form of sentiment analysis tasks.\n* IntoxiCat : The instructions were created in form of binary classification tasks. The task is to define wether a given text is toxic or no.\n* NLUCat : The instructions were created in form of phrase generation tasks to express a given intent.\n* Parafraseja : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.\n* PAWS-ca : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.\n* sts-ca : The instructions were created in form of text generation tasks. The task is to generate a text equivalent by meaning to a given text.\n* teca : The instructions were created in 2 different ways with the distribution 70% - 30%. The first way is in form of entailment generation tasks. The second way is to define whether one given text is an entailment of another given text.\n* WikiCat : The instructions were created in 2 different ways of text classification tasks with the distribution 70% - 30%. The first way is to define a category of a given text. The second way is to answer where a given text belongs to a certain category in a form of alternative question.\n\n\nDataset Structure\n-----------------",
"#### Data Splits\n\n\n* URL: 178044 instructions\n* URL: 28125 instructions\n* URL: 29149 instructions",
"### Data Instances\n\n\nThree JSONL files, one for each split.\n\n\nAn example of 'test' looks as follows:",
"### Category Distibution\n\n\nCategory: ner, Number of instructions: 59410, %: 25.24%\nCategory: paraphrasis, Number of instructions: 34695, %: 14.74%\nCategory: text\\_classification, Number of instructions: 33393, %: 14.19%\nCategory: toxicity, Number of instructions: 29809, %: 12.66%\nCategory: qa, Number of instructions: 27427, %: 11.65%\nCategory: emotion\\_detection, Number of instructions: 18492, %: 7.85%\nCategory: phrase\\_generation, Number of instructions: 11873, %: 5.04%\nCategory: entailment\\_generation, Number of instructions: 6354, %: 2.70%\nCategory: sentiment\\_analysis, Number of instructions: 5750, %: 2.44%\nCategory: abstractive\\_summarization, Number of instructions: 2999, %: 1.27%\nCategory: extreme\\_summarization, Number of instructions: 2999, %: 1.27%\nCategory: entailment, Number of instructions: 2117, %: 0.89%"
] | [
11,
61,
568,
23,
28,
235
] | [
"passage: TAGS\n#language-Catalan #region-us \n### Dataset Summary\n\n\nInstruCat is a dataset consisting of 235318 instructions in Catalan, plus dolly\\_en (English) 11k and a custom Spanish instructional one (10k) to avoid catastrophic forgetfullness for the Flor multilingual model."
] |
a234ba0d39efb29028c53803e9f365ff15b661c3 |
# *WorldFloodsv2* dataset
This repository contains the *WorldFloodsv2* dataset released with the publication:
> E. Portalés-Julià, G. Mateo-García, C. Purcell, and L. Gómez-Chova [Global flood extent segmentation in optical satellite images](https://www.nature.com/articles/s41598-023-47595-7). _Scientific Reports 13, 20316_ (2023). DOI: 10.1038/s41598-023-47595-7.
The [*WorldFloodsv2* database](https://www.nature.com/articles/s41598-023-47595-7) contains 509 pairs of Sentinel-2 images and flood segmentation masks, split into train, val and test sets.
It requires approximately 76GB of hard-disk storage.
<img src="worldfloods_v2.png" alt="licence" width="65%"/>
## Download the dataset
```
huggingface-cli download --cache-dir /path/to/cachedir --local-dir /path/to/localdir/WorldFloodsv2 --repo-type dataset isp-uv-es/WorldFloodsv2
```
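Equivalently, the snapshot can be fetched from Python with `huggingface_hub`; a minimal sketch (the local paths below are placeholders):

```python
# Download the full dataset snapshot (~76GB) to a local directory.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="isp-uv-es/WorldFloodsv2",
    repo_type="dataset",
    local_dir="/path/to/localdir/WorldFloodsv2",
    cache_dir="/path/to/cachedir",
)
```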
## Explore the dataset
The [exploring *WorldFloodsv2*](https://spaceml-org.github.io/ml4floods/content/prep/exploring_worldfloods.html) tutorial in the [ml4floods](https://github.com/spaceml-org/ml4floods) package shows how to
process the dataset and plot the images and masks.
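For a quick look without the ml4floods tooling, something along these lines can work; note that the file paths, band order and scaling below are assumptions for illustration, not the documented layout of the dataset:

```python
# Sketch: plot one Sentinel-2 tile next to its flood segmentation mask.
import rasterio
import numpy as np
import matplotlib.pyplot as plt

s2_path = "WorldFloodsv2/train/S2/some_tile.tif"    # hypothetical image path
mask_path = "WorldFloodsv2/train/gt/some_tile.tif"  # hypothetical mask path

with rasterio.open(s2_path) as src:
    s2 = src.read()        # (bands, H, W)
with rasterio.open(mask_path) as src:
    mask = src.read(1)     # single-band segmentation mask

# Rough RGB composite; band indices and the scaling factor may differ in practice.
rgb = np.clip(np.transpose(s2[[3, 2, 1]], (1, 2, 0)) / 3000.0, 0, 1)

fig, ax = plt.subplots(1, 2, figsize=(10, 5))
ax[0].imshow(rgb); ax[0].set_title("Sentinel-2 RGB")
ax[1].imshow(mask); ax[1].set_title("Flood mask")
plt.show()
```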
## Licence
The *WorldFloods* database and all pre-trained models are released under a [Creative Commons non-commercial licence](https://creativecommons.org/licenses/by-nc/4.0/legalcode.txt)
## Cite
If you find this work useful, please cite:
```
@article{portales-julia_global_2023,
title = {Global flood extent segmentation in optical satellite images},
volume = {13},
issn = {2045-2322},
doi = {10.1038/s41598-023-47595-7},
number = {1},
urldate = {2023-11-30},
journal = {Scientific Reports},
author = {Portalés-Julià, Enrique and Mateo-García, Gonzalo and Purcell, Cormac and Gómez-Chova, Luis},
month = nov,
year = {2023},
pages = {20316},
}
```
| isp-uv-es/WorldFloodsv2 | [
"license:cc-by-nc-4.0",
"remote sensing",
"sentinel2",
"landsat",
"floods",
"region:us"
] | 2023-12-13T17:43:27+00:00 | {"license": "cc-by-nc-4.0", "pipeline_tag": "image-segmentation", "tags": ["remote sensing", "sentinel2", "landsat", "floods"]} | 2024-01-10T11:38:21+00:00 | [] | [] | TAGS
#license-cc-by-nc-4.0 #remote sensing #sentinel2 #landsat #floods #region-us
|
# *WorldFloodsv2* dataset
This repository contains the *WorldFloodsv2* dataset released with the publication:
> E. Portalés-Julià, G. Mateo-García, C. Purcell, and L. Gómez-Chova Global flood extent segmentation in optical satellite images. _Scientific Reports 13, 20316_ (2023). DOI: 10.1038/s41598-023-47595-7.
The *WorldFloodsv2* database contains 509 pairs of Sentinel-2 images and flood segmentation masks, split into train, val and test sets.
It requires approximately 76GB of hard-disk storage.
<img src="worldfloods_v2.png" alt="licence" width="65%"/>
## Download the dataset
## Explore the dataset
The exploring *WorldFloodsv2* tutorial in the ml4floods package shows how to
process the dataset and plot the images and masks.
## Licence
The *WorldFloods* database and all pre-trained models are released under a Creative Commons non-commercial licence
## Cite
If you find this work useful, please cite:
| [
"# *WorldFloodsv2* dataset\n\nThis repository contains the *WorldFloodsv2* dataset released with the publication:\n\n> E. Portalés-Julià, G. Mateo-García, C. Purcell, and L. Gómez-Chova Global flood extent segmentation in optical satellite images. _Scientific Reports 13, 20316_ (2023). DOI: 10.1038/s41598-023-47595-7.\n\n\nThe *WorldFloodsv2* database contains 509 pairs of Sentinel-2 images and flood segmentation masks. Splitted in train, val and test sets.\nIt requires approximately 76GB of hard-disk storage. \n\n\n<img src=\"worldfloods_v2.png\" alt=\"licence\" width=\"65%\"/>",
"## Download the dataset",
"## Explore the dataset\n\nThe exploring *WorldFloodsv2* tutorial in the ml4floods package shows how to \nprocess the dataset and plot the images and masks.",
"## Licence\n\nThe *WorldFloods* database and all pre-trained models are released under a Creative Commons non-commercial licence",
"## Cite\n\nIf you find this work useful, please cite:"
] | [
"TAGS\n#license-cc-by-nc-4.0 #remote sensing #sentinel2 #landsat #floods #region-us \n",
"# *WorldFloodsv2* dataset\n\nThis repository contains the *WorldFloodsv2* dataset released with the publication:\n\n> E. Portalés-Julià, G. Mateo-García, C. Purcell, and L. Gómez-Chova Global flood extent segmentation in optical satellite images. _Scientific Reports 13, 20316_ (2023). DOI: 10.1038/s41598-023-47595-7.\n\n\nThe *WorldFloodsv2* database contains 509 pairs of Sentinel-2 images and flood segmentation masks. Splitted in train, val and test sets.\nIt requires approximately 76GB of hard-disk storage. \n\n\n<img src=\"worldfloods_v2.png\" alt=\"licence\" width=\"65%\"/>",
"## Download the dataset",
"## Explore the dataset\n\nThe exploring *WorldFloodsv2* tutorial in the ml4floods package shows how to \nprocess the dataset and plot the images and masks.",
"## Licence\n\nThe *WorldFloods* database and all pre-trained models are released under a Creative Commons non-commercial licence",
"## Cite\n\nIf you find this work useful, please cite:"
] | [
33,
178,
5,
40,
30,
13
] | [
"passage: TAGS\n#license-cc-by-nc-4.0 #remote sensing #sentinel2 #landsat #floods #region-us \n# *WorldFloodsv2* dataset\n\nThis repository contains the *WorldFloodsv2* dataset released with the publication:\n\n> E. Portalés-Julià, G. Mateo-García, C. Purcell, and L. Gómez-Chova Global flood extent segmentation in optical satellite images. _Scientific Reports 13, 20316_ (2023). DOI: 10.1038/s41598-023-47595-7.\n\n\nThe *WorldFloodsv2* database contains 509 pairs of Sentinel-2 images and flood segmentation masks. Splitted in train, val and test sets.\nIt requires approximately 76GB of hard-disk storage. \n\n\n<img src=\"worldfloods_v2.png\" alt=\"licence\" width=\"65%\"/>## Download the dataset## Explore the dataset\n\nThe exploring *WorldFloodsv2* tutorial in the ml4floods package shows how to \nprocess the dataset and plot the images and masks.## Licence\n\nThe *WorldFloods* database and all pre-trained models are released under a Creative Commons non-commercial licence## Cite\n\nIf you find this work useful, please cite:"
] |
d25b21e407a2d22c96681fe3ae700eb2cc0c9bb9 | # Dataset Card for "summeval-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sproos/summeval-de | [
"region:us"
] | 2023-12-13T17:46:42+00:00 | {"dataset_info": {"features": [{"name": "machine_summaries", "sequence": "string"}, {"name": "human_summaries", "sequence": "string"}, {"name": "relevance", "sequence": "float64"}, {"name": "coherence", "sequence": "float64"}, {"name": "fluency", "sequence": "float64"}, {"name": "consistency", "sequence": "float64"}, {"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1254264, "num_examples": 100}], "download_size": 522770, "dataset_size": 1254264}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T22:40:39+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summeval-de"
More Information needed | [
"# Dataset Card for \"summeval-de\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summeval-de\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"summeval-de\"\n\nMore Information needed"
] |
6c9e6944ea8ec20291a1488eb762f32a438cd6ae | # Dataset Card for "summeval-fr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sproos/summeval-fr | [
"region:us"
] | 2023-12-13T17:46:54+00:00 | {"dataset_info": {"features": [{"name": "machine_summaries", "sequence": "string"}, {"name": "human_summaries", "sequence": "string"}, {"name": "relevance", "sequence": "float64"}, {"name": "coherence", "sequence": "float64"}, {"name": "fluency", "sequence": "float64"}, {"name": "consistency", "sequence": "float64"}, {"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1276634, "num_examples": 100}], "download_size": 503320, "dataset_size": 1276634}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T22:40:46+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summeval-fr"
More Information needed | [
"# Dataset Card for \"summeval-fr\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summeval-fr\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"summeval-fr\"\n\nMore Information needed"
] |
806c651ffcc893ee518a2b198fa160d973e2182b | # Dataset Card for "summeval-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sproos/summeval-es | [
"region:us"
] | 2023-12-13T17:47:07+00:00 | {"dataset_info": {"features": [{"name": "machine_summaries", "sequence": "string"}, {"name": "human_summaries", "sequence": "string"}, {"name": "relevance", "sequence": "float64"}, {"name": "coherence", "sequence": "float64"}, {"name": "fluency", "sequence": "float64"}, {"name": "consistency", "sequence": "float64"}, {"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1232365, "num_examples": 100}], "download_size": 485176, "dataset_size": 1232365}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T22:40:42+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summeval-es"
More Information needed | [
"# Dataset Card for \"summeval-es\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summeval-es\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"summeval-es\"\n\nMore Information needed"
] |
d773054c462a22de9a3d4088ab49890bba06118f | # Dataset Card for "summeval-tr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sproos/summeval-tr | [
"region:us"
] | 2023-12-13T17:47:14+00:00 | {"dataset_info": {"features": [{"name": "machine_summaries", "sequence": "string"}, {"name": "human_summaries", "sequence": "string"}, {"name": "relevance", "sequence": "float64"}, {"name": "coherence", "sequence": "float64"}, {"name": "fluency", "sequence": "float64"}, {"name": "consistency", "sequence": "float64"}, {"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1173088, "num_examples": 100}], "download_size": 506357, "dataset_size": 1173088}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T22:40:50+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summeval-tr"
More Information needed | [
"# Dataset Card for \"summeval-tr\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summeval-tr\"\n\nMore Information needed"
] | [
6,
15
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"summeval-tr\"\n\nMore Information needed"
] |
4eb1c5a78fa8f0e9236c1c969d399c3834944e1e | # Dataset Card for "summeval-sw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | sproos/summeval-sw | [
"region:us"
] | 2023-12-13T17:47:26+00:00 | {"dataset_info": {"features": [{"name": "machine_summaries", "sequence": "string"}, {"name": "human_summaries", "sequence": "string"}, {"name": "relevance", "sequence": "float64"}, {"name": "coherence", "sequence": "float64"}, {"name": "fluency", "sequence": "float64"}, {"name": "consistency", "sequence": "float64"}, {"name": "text", "dtype": "string"}, {"name": "id", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 1172881, "num_examples": 100}], "download_size": 484750, "dataset_size": 1172881}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-13T22:40:55+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "summeval-sw"
More Information needed | [
"# Dataset Card for \"summeval-sw\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"summeval-sw\"\n\nMore Information needed"
] | [
6,
16
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"summeval-sw\"\n\nMore Information needed"
] |
0eda02cccc8a93734645def496b240c20448776f | # Dataset Card for "indian-medicinal-plants"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | mikehemberger/indian-medicinal-plants | [
"region:us"
] | 2023-12-13T18:20:54+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "label", "dtype": {"class_label": {"names": {"0": "Aloevera", "1": "Amla"}}}}], "splits": [{"name": "train", "num_bytes": 21027703.0, "num_examples": 495}], "download_size": 21028445, "dataset_size": 21027703.0}} | 2023-12-14T11:37:02+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "indian-medicinal-plants"
More Information needed | [
"# Dataset Card for \"indian-medicinal-plants\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"indian-medicinal-plants\"\n\nMore Information needed"
] | [
6,
18
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"indian-medicinal-plants\"\n\nMore Information needed"
] |
4f351ef3ad9c9be97ed20119ee599227c6aefcde |
# Bangumi Image Base of Infinite Stratos
This is the image base of bangumi Infinite Stratos, we detected 39 characters, 4121 images in total. The full dataset is [here](all.zip).
**Please note that these image bases are not guaranteed to be 100% cleaned; they may in fact be noisy.** If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| # | Images | Download | Preview 1 | Preview 2 | Preview 3 | Preview 4 | Preview 5 | Preview 6 | Preview 7 | Preview 8 |
|:------|---------:|:---------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|:-------------------------------|
| 0 | 1066 | [Download](0/dataset.zip) |  |  |  |  |  |  |  |  |
| 1 | 68 | [Download](1/dataset.zip) |  |  |  |  |  |  |  |  |
| 2 | 21 | [Download](2/dataset.zip) |  |  |  |  |  |  |  |  |
| 3 | 200 | [Download](3/dataset.zip) |  |  |  |  |  |  |  |  |
| 4 | 98 | [Download](4/dataset.zip) |  |  |  |  |  |  |  |  |
| 5 | 27 | [Download](5/dataset.zip) |  |  |  |  |  |  |  |  |
| 6 | 131 | [Download](6/dataset.zip) |  |  |  |  |  |  |  |  |
| 7 | 247 | [Download](7/dataset.zip) |  |  |  |  |  |  |  |  |
| 8 | 6 | [Download](8/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 9 | 10 | [Download](9/dataset.zip) |  |  |  |  |  |  |  |  |
| 10 | 8 | [Download](10/dataset.zip) |  |  |  |  |  |  |  |  |
| 11 | 416 | [Download](11/dataset.zip) |  |  |  |  |  |  |  |  |
| 12 | 36 | [Download](12/dataset.zip) |  |  |  |  |  |  |  |  |
| 13 | 344 | [Download](13/dataset.zip) |  |  |  |  |  |  |  |  |
| 14 | 16 | [Download](14/dataset.zip) |  |  |  |  |  |  |  |  |
| 15 | 12 | [Download](15/dataset.zip) |  |  |  |  |  |  |  |  |
| 16 | 115 | [Download](16/dataset.zip) |  |  |  |  |  |  |  |  |
| 17 | 15 | [Download](17/dataset.zip) |  |  |  |  |  |  |  |  |
| 18 | 38 | [Download](18/dataset.zip) |  |  |  |  |  |  |  |  |
| 19 | 17 | [Download](19/dataset.zip) |  |  |  |  |  |  |  |  |
| 20 | 18 | [Download](20/dataset.zip) |  |  |  |  |  |  |  |  |
| 21 | 15 | [Download](21/dataset.zip) |  |  |  |  |  |  |  |  |
| 22 | 12 | [Download](22/dataset.zip) |  |  |  |  |  |  |  |  |
| 23 | 17 | [Download](23/dataset.zip) |  |  |  |  |  |  |  |  |
| 24 | 8 | [Download](24/dataset.zip) |  |  |  |  |  |  |  |  |
| 25 | 12 | [Download](25/dataset.zip) |  |  |  |  |  |  |  |  |
| 26 | 394 | [Download](26/dataset.zip) |  |  |  |  |  |  |  |  |
| 27 | 34 | [Download](27/dataset.zip) |  |  |  |  |  |  |  |  |
| 28 | 378 | [Download](28/dataset.zip) |  |  |  |  |  |  |  |  |
| 29 | 19 | [Download](29/dataset.zip) |  |  |  |  |  |  |  |  |
| 30 | 13 | [Download](30/dataset.zip) |  |  |  |  |  |  |  |  |
| 31 | 8 | [Download](31/dataset.zip) |  |  |  |  |  |  |  |  |
| 32 | 6 | [Download](32/dataset.zip) |  |  |  |  |  |  | N/A | N/A |
| 33 | 10 | [Download](33/dataset.zip) |  |  |  |  |  |  |  |  |
| 34 | 10 | [Download](34/dataset.zip) |  |  |  |  |  |  |  |  |
| 35 | 7 | [Download](35/dataset.zip) |  |  |  |  |  |  |  | N/A |
| 36 | 11 | [Download](36/dataset.zip) |  |  |  |  |  |  |  |  |
| 37 | 9 | [Download](37/dataset.zip) |  |  |  |  |  |  |  |  |
| noise | 249 | [Download](-1/dataset.zip) |  |  |  |  |  |  |  |  |
| BangumiBase/infinitestratos | [
"size_categories:1K<n<10K",
"license:mit",
"art",
"region:us"
] | 2023-12-13T18:21:23+00:00 | {"license": "mit", "size_categories": ["1K<n<10K"], "tags": ["art"]} | 2023-12-13T21:20:39+00:00 | [] | [] | TAGS
#size_categories-1K<n<10K #license-mit #art #region-us
| Bangumi Image Base of Infinite Stratos
======================================
This is the image base of bangumi Infinite Stratos, we detected 39 characters, 4121 images in total. The full dataset is here.
Please note that these image bases are not guaranteed to be 100% cleaned; they may in fact be noisy. If you intend to manually train models using this dataset, we recommend performing necessary preprocessing on the downloaded dataset to eliminate potential noisy samples (approximately 1% probability).
Here is the characters' preview:
| [] | [
"TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] | [
25
] | [
"passage: TAGS\n#size_categories-1K<n<10K #license-mit #art #region-us \n"
] |
1eb4a97ead39ff8e2134030cc9dcd18dd78fba08 | # Dataset Card for "videos2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | bvallegc/videos2 | [
"region:us"
] | 2023-12-13T18:42:17+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}], "dataset_info": {"features": [{"name": "video_data", "dtype": "binary"}, {"name": "duration_seconds", "dtype": "float64"}, {"name": "video_path", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 7126214309, "num_examples": 8699}], "download_size": 7111325000, "dataset_size": 7126214309}} | 2023-12-13T18:48:18+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "videos2"
More Information needed | [
"# Dataset Card for \"videos2\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"videos2\"\n\nMore Information needed"
] | [
6,
13
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"videos2\"\n\nMore Information needed"
] |
1e2ea78cb5ddcd277d8114996ccebf19f74e25dd |
# Dataset of cardigan (Arknights)
This is the dataset of cardigan (Arknights), containing 85 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
This is a WebUI that contains the crawlers and other things: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 85 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 207 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 224 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 85 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 85 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 85 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 207 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 207 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 130 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 224 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 224 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
| AppleHarem/cardigan_arknights | [
"task_categories:text-to-image",
"size_categories:n<1K",
"license:mit",
"art",
"not-for-all-audiences",
"region:us"
] | 2023-12-13T19:57:19+00:00 | {"license": "mit", "size_categories": ["n<1K"], "task_categories": ["text-to-image"], "tags": ["art", "not-for-all-audiences"]} | 2023-12-13T19:57:25+00:00 | [] | [] | TAGS
#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us
| Dataset of cardigan (Arknights)
===============================
This is the dataset of cardigan (Arknights), containing 85 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by DeepGHS Team(huggingface organization).
This is a WebUI that contains the crawlers and other things: (LittleAppleWebUI)
| [] | [
"TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] | [
44
] | [
"passage: TAGS\n#task_categories-text-to-image #size_categories-n<1K #license-mit #art #not-for-all-audiences #region-us \n"
] |
d3d203deb3a110859217ca0f8a9780751b5779d4 | # Dataset Card for "openassistant-guanaco-chat-format"
Copy of [timdettmers/openassistant-guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco). Modified for use with Hugging Face chat templates. | habanoz/openassistant-guanaco-chat-format | [
"region:us"
] | 2023-12-13T20:12:05+00:00 | {"configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}, {"split": "test", "path": "data/test-*"}]}], "dataset_info": {"features": [{"name": "conversation", "list": [{"name": "content", "dtype": "string"}, {"name": "role", "dtype": "string"}]}], "splits": [{"name": "train", "num_bytes": 15469605, "num_examples": 9846}, {"name": "test", "num_bytes": 818818, "num_examples": 518}], "download_size": 9432751, "dataset_size": 16288423}} | 2023-12-14T08:47:14+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "openassistant-guanaco-chat-format"
Copy of timdettmers/openassistant-guanaco. Modified for use with Hugging Face chat templates. | [
"# Dataset Card for \"openassistant-guanaco-chat-format\"\n\nCopy of timdettmers/openassistant-guanaco. Modified to use with hugging face chat tempaltes."
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"openassistant-guanaco-chat-format\"\n\nCopy of timdettmers/openassistant-guanaco. Modified to use with hugging face chat tempaltes."
] | [
6,
46
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"openassistant-guanaco-chat-format\"\n\nCopy of timdettmers/openassistant-guanaco. Modified to use with hugging face chat tempaltes."
] |
61ba3030b7cb636227e8d61f59d2c9904d02b554 | # Dataset Card for "SD_1213"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) | bys2058/SD_1213 | [
"region:us"
] | 2023-12-13T20:27:57+00:00 | {"dataset_info": {"features": [{"name": "image", "dtype": "image"}, {"name": "original_hairmask", "dtype": "image"}, {"name": "result_image", "dtype": "image"}, {"name": "result_hairmask", "dtype": "image"}, {"name": "image_caption", "dtype": "string"}], "splits": [{"name": "train", "num_bytes": 120067863874.361, "num_examples": 69477}], "download_size": 116951622981, "dataset_size": 120067863874.361}, "configs": [{"config_name": "default", "data_files": [{"split": "train", "path": "data/train-*"}]}]} | 2023-12-18T22:38:23+00:00 | [] | [] | TAGS
#region-us
| # Dataset Card for "SD_1213"
More Information needed | [
"# Dataset Card for \"SD_1213\"\n\nMore Information needed"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for \"SD_1213\"\n\nMore Information needed"
] | [
6,
14
] | [
"passage: TAGS\n#region-us \n# Dataset Card for \"SD_1213\"\n\nMore Information needed"
] |
fc6868cae834322a83dfb76407eea8cffe45a1fd |
# Dataset Card for Evaluation run of GreenNode/GreenNodeLM-7B-v1olet
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [GreenNode/GreenNodeLM-7B-v1olet](https://huggingface.co/GreenNode/GreenNodeLM-7B-v1olet) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet",
"harness_winogrande_5",
split="train")
```
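To see what else is in the repository, the configurations can be listed and the aggregated "results" configuration described above can be loaded the same way (a sketch; the config name follows this card's description):

```python
# List available configurations, then load the aggregated results.
from datasets import get_dataset_config_names, load_dataset

configs = get_dataset_config_names("open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet")
print(len(configs), configs[:5])

results = load_dataset(
    "open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet",
    "results",
    split="train",
)
```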
## Latest results
These are the [latest results from run 2023-12-13T20:30:07.482326](https://huggingface.co/datasets/open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet/blob/main/results_2023-12-13T20-30-07.482326.json)(note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6411740347675706,
"acc_stderr": 0.03228342039008203,
"acc_norm": 0.6407691161331389,
"acc_norm_stderr": 0.03295002376578124,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6907171691355769,
"mc2_stderr": 0.015243695704371275
},
"harness|arc:challenge|25": {
"acc": 0.7030716723549488,
"acc_stderr": 0.013352025976725223,
"acc_norm": 0.7261092150170648,
"acc_norm_stderr": 0.013032004972989503
},
"harness|hellaswag|10": {
"acc": 0.7143995220075682,
"acc_stderr": 0.004507768029590101,
"acc_norm": 0.8770165305715992,
"acc_norm_stderr": 0.0032774703870227257
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.04451807959055328,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.04451807959055328
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554956,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.02432173848460235,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.02432173848460235
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976037,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976037
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.0306365913486998,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.0306365913486998
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.01354741565866226,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.01354741565866226
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4547486033519553,
"acc_stderr": 0.016653875777524006,
"acc_norm": 0.4547486033519553,
"acc_norm_stderr": 0.016653875777524006
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7026143790849673,
"acc_stderr": 0.026173908506718576,
"acc_norm": 0.7026143790849673,
"acc_norm_stderr": 0.026173908506718576
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399662,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399662
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4680573663624511,
"acc_stderr": 0.012744149704869647,
"acc_norm": 0.4680573663624511,
"acc_norm_stderr": 0.012744149704869647
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6654411764705882,
"acc_stderr": 0.0286619962023353,
"acc_norm": 0.6654411764705882,
"acc_norm_stderr": 0.0286619962023353
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696644,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696644
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.6907171691355769,
"mc2_stderr": 0.015243695704371275
},
"harness|winogrande|5": {
"acc": 0.8232044198895028,
"acc_stderr": 0.010721923287918742
},
"harness|gsm8k|5": {
"acc": 0.66868840030326,
"acc_stderr": 0.01296499967968867
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet | [
"region:us"
] | 2023-12-13T20:32:58+00:00 | {"pretty_name": "Evaluation run of GreenNode/GreenNodeLM-7B-v1olet", "dataset_summary": "Dataset automatically created during the evaluation run of model [GreenNode/GreenNodeLM-7B-v1olet](https://huggingface.co/GreenNode/GreenNodeLM-7B-v1olet) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T20:30:07.482326](https://huggingface.co/datasets/open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet/blob/main/results_2023-12-13T20-30-07.482326.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6411740347675706,\n \"acc_stderr\": 0.03228342039008203,\n \"acc_norm\": 0.6407691161331389,\n \"acc_norm_stderr\": 0.03295002376578124,\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6907171691355769,\n \"mc2_stderr\": 0.015243695704371275\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.7030716723549488,\n \"acc_stderr\": 0.013352025976725223,\n \"acc_norm\": 0.7261092150170648,\n \"acc_norm_stderr\": 0.013032004972989503\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7143995220075682,\n \"acc_stderr\": 0.004507768029590101,\n \"acc_norm\": 0.8770165305715992,\n \"acc_norm_stderr\": 0.0032774703870227257\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.45,\n 
\"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n \"acc_stderr\": 0.04451807959055328,\n \"acc_norm\": 0.4523809523809524,\n \"acc_norm_stderr\": 0.04451807959055328\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554956,\n \"acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6410256410256411,\n 
\"acc_stderr\": 0.02432173848460235,\n \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.02432173848460235\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976037,\n \"acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976037\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n \"acc_stderr\": 0.0306365913486998,\n \"acc_norm\": 0.7040358744394619,\n \"acc_norm_stderr\": 0.0306365913486998\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8263090676883781,\n \"acc_stderr\": 0.01354741565866226,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.01354741565866226\n },\n 
\"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4547486033519553,\n \"acc_stderr\": 0.016653875777524006,\n \"acc_norm\": 0.4547486033519553,\n \"acc_norm_stderr\": 0.016653875777524006\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7026143790849673,\n \"acc_stderr\": 0.026173908506718576,\n \"acc_norm\": 0.7026143790849673,\n \"acc_norm_stderr\": 0.026173908506718576\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n \"acc_stderr\": 0.026082700695399662,\n \"acc_norm\": 0.6977491961414791,\n \"acc_norm_stderr\": 0.026082700695399662\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4680573663624511,\n \"acc_stderr\": 0.012744149704869647,\n \"acc_norm\": 0.4680573663624511,\n \"acc_norm_stderr\": 0.012744149704869647\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6654411764705882,\n \"acc_stderr\": 0.0286619962023353,\n \"acc_norm\": 0.6654411764705882,\n \"acc_norm_stderr\": 0.0286619962023353\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696644,\n \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696644\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.6907171691355769,\n \"mc2_stderr\": 0.015243695704371275\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8232044198895028,\n \"acc_stderr\": 0.010721923287918742\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.66868840030326,\n \"acc_stderr\": 0.01296499967968867\n }\n}\n```", "repo_url": "https://huggingface.co/GreenNode/GreenNodeLM-7B-v1olet", "leaderboard_url": 
"https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|arc:challenge|25_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|gsm8k|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hellaswag|10_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-30-07.482326.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-30-07.482326.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-30-07.482326.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T20-30-07.482326.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-30-07.482326.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-30-07.482326.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["**/details_harness|winogrande|5_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T20-30-07.482326.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T20_30_07.482326", "path": ["results_2023-12-13T20-30-07.482326.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T20-30-07.482326.parquet"]}]}]} | 2023-12-13T20:33:43+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of GreenNode/GreenNodeLM-7B-v1olet
Dataset automatically created during the evaluation run of model GreenNode/GreenNodeLM-7B-v1olet on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
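A minimal sketch of that call (mirroring the snippet embedded in this repo's metadata; `harness_winogrande_5` is just one of the 63 available config names):

```python
from datasets import load_dataset

# Load the per-sample details for one evaluated task of this model.
# Any config name listed under "configs" in the repo metadata works here.
data = load_dataset(
    "open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet",
    "harness_winogrande_5",
    split="train",
)
```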
## Latest results
These are the latest results from run 2023-12-13T20:30:07.482326 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
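The result table itself is not reproduced in this summary; as a sketch, the same aggregated numbers can be pulled from the "results" configuration (whose `latest` split points at this run, per the config list in the metadata above):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent run of this model.
# The "latest" split tracks the newest timestamped evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_GreenNode__GreenNodeLM-7B-v1olet",
    "results",
    split="latest",
)
```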
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of GreenNode/GreenNodeLM-7B-v1olet\n\n\n\nDataset automatically created during the evaluation run of model GreenNode/GreenNodeLM-7B-v1olet on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T20:30:07.482326(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of GreenNode/GreenNodeLM-7B-v1olet\n\n\n\nDataset automatically created during the evaluation run of model GreenNode/GreenNodeLM-7B-v1olet on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T20:30:07.482326(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
191,
68,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of GreenNode/GreenNodeLM-7B-v1olet\n\n\n\nDataset automatically created during the evaluation run of model GreenNode/GreenNodeLM-7B-v1olet on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T20:30:07.482326(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
d5174572f16341ce849854c56ae31e30a5207075 |
# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B-Chat-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [viethq188/LeoScorpius-7B-Chat-DPO](https://huggingface.co/viethq188/LeoScorpius-7B-Chat-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_viethq188__LeoScorpius-7B-Chat-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-12-13T20:45:03.306502](https://huggingface.co/datasets/open-llm-leaderboard/details_viethq188__LeoScorpius-7B-Chat-DPO/blob/main/results_2023-12-13T20-45-03.306502.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6555497766998586,
"acc_stderr": 0.032028078217227106,
"acc_norm": 0.6555477264990944,
"acc_norm_stderr": 0.032687540203834554,
"mc1": 0.5483476132190942,
"mc1_stderr": 0.01742148030027764,
"mc2": 0.688337819182235,
"mc2_stderr": 0.014926781640311892
},
"harness|arc:challenge|25": {
"acc": 0.6757679180887372,
"acc_stderr": 0.01367881039951882,
"acc_norm": 0.7047781569965871,
"acc_norm_stderr": 0.01332975029338232
},
"harness|hellaswag|10": {
"acc": 0.708424616610237,
"acc_stderr": 0.004535589759202658,
"acc_norm": 0.8797052380003983,
"acc_norm_stderr": 0.003246410721850727
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.02548718714785938,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.02548718714785938
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268542,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268542
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.02366129639396428,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.02366129639396428
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.029723278961476664,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.029723278961476664
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.02615686752393104,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.02615686752393104
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8263090676883781,
"acc_stderr": 0.013547415658662257,
"acc_norm": 0.8263090676883781,
"acc_norm_stderr": 0.013547415658662257
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.02418242749657761,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.02418242749657761
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46145251396648046,
"acc_stderr": 0.016672731267552254,
"acc_norm": 0.46145251396648046,
"acc_norm_stderr": 0.016672731267552254
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008564,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008564
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46284224250325945,
"acc_stderr": 0.012734923579532067,
"acc_norm": 0.46284224250325945,
"acc_norm_stderr": 0.012734923579532067
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.0287951855742913,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.0287951855742913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061452,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061452
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5483476132190942,
"mc1_stderr": 0.01742148030027764,
"mc2": 0.688337819182235,
"mc2_stderr": 0.014926781640311892
},
"harness|winogrande|5": {
"acc": 0.8208366219415943,
"acc_stderr": 0.010777949156047986
},
"harness|gsm8k|5": {
"acc": 0.690674753601213,
"acc_stderr": 0.012731710925078145
}
}
```
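For programmatic access to these aggregated numbers, the same metrics are also stored in the `results` configuration of this dataset. A minimal sketch (assuming the standard `datasets` API; per the configuration list below, the `latest` split of the `results` config points at the most recent run):

```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_viethq188__LeoScorpius-7B-Chat-DPO",
    "results",
    split="latest",
)

# Inspect the first row; it carries the aggregated metrics shown in the JSON above.
print(results[0])
```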
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_viethq188__LeoScorpius-7B-Chat-DPO | [
"region:us"
] | 2023-12-13T20:47:55+00:00 | {"pretty_name": "Evaluation run of viethq188/LeoScorpius-7B-Chat-DPO", "dataset_summary": "Dataset automatically created during the evaluation run of model [viethq188/LeoScorpius-7B-Chat-DPO](https://huggingface.co/viethq188/LeoScorpius-7B-Chat-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_viethq188__LeoScorpius-7B-Chat-DPO\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T20:45:03.306502](https://huggingface.co/datasets/open-llm-leaderboard/details_viethq188__LeoScorpius-7B-Chat-DPO/blob/main/results_2023-12-13T20-45-03.306502.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6555497766998586,\n \"acc_stderr\": 0.032028078217227106,\n \"acc_norm\": 0.6555477264990944,\n \"acc_norm_stderr\": 0.032687540203834554,\n \"mc1\": 0.5483476132190942,\n \"mc1_stderr\": 0.01742148030027764,\n \"mc2\": 0.688337819182235,\n \"mc2_stderr\": 0.014926781640311892\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6757679180887372,\n \"acc_stderr\": 0.01367881039951882,\n \"acc_norm\": 0.7047781569965871,\n \"acc_norm_stderr\": 0.01332975029338232\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.708424616610237,\n \"acc_stderr\": 0.004535589759202658,\n \"acc_norm\": 0.8797052380003983,\n \"acc_norm_stderr\": 0.003246410721850727\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 
0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.02548718714785938,\n \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.02548718714785938\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268542,\n \"acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268542\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n },\n 
\"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 0.6794871794871795,\n \"acc_stderr\": 0.02366129639396428,\n \"acc_norm\": 0.6794871794871795,\n \"acc_norm_stderr\": 0.02366129639396428\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.029723278961476664,\n \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.029723278961476664\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342853,\n \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342853\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.033953227263757976,\n \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.033953227263757976\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.02615686752393104,\n \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.02615686752393104\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 
0.8263090676883781,\n \"acc_stderr\": 0.013547415658662257,\n \"acc_norm\": 0.8263090676883781,\n \"acc_norm_stderr\": 0.013547415658662257\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.02418242749657761,\n \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.02418242749657761\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46145251396648046,\n \"acc_stderr\": 0.016672731267552254,\n \"acc_norm\": 0.46145251396648046,\n \"acc_norm_stderr\": 0.016672731267552254\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008564,\n \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008564\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46284224250325945,\n \"acc_stderr\": 0.012734923579532067,\n \"acc_norm\": 0.46284224250325945,\n \"acc_norm_stderr\": 0.012734923579532067\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.0287951855742913,\n \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.0287951855742913\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061452,\n \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061452\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5483476132190942,\n \"mc1_stderr\": 0.01742148030027764,\n \"mc2\": 0.688337819182235,\n \"mc2_stderr\": 0.014926781640311892\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8208366219415943,\n \"acc_stderr\": 0.010777949156047986\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.690674753601213,\n \"acc_stderr\": 
0.012731710925078145\n }\n}\n```", "repo_url": "https://huggingface.co/viethq188/LeoScorpius-7B-Chat-DPO", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|arc:challenge|25_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|gsm8k|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hellaswag|10_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-45-03.306502.parquet", 
"**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-45-03.306502.parquet", 
"**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-45-03.306502.parquet", 
"**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T20-45-03.306502.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": 
["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-45-03.306502.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T20_45_03.306502", "path": ["**/details_harness|winogrande|5_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T20-45-03.306502.parquet"]}]}, {"config_name": "results", "data_files": [{"split": 
"2023_12_13T20_45_03.306502", "path": ["results_2023-12-13T20-45-03.306502.parquet"]}, {"split": "latest", "path": ["results_2023-12-13T20-45-03.306502.parquet"]}]}]} | 2023-12-13T20:48:39+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B-Chat-DPO
Dataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B-Chat-DPO on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
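```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_viethq188__LeoScorpius-7B-Chat-DPO",
	"harness_winogrande_5",
	split="train")
```

Any of the other configurations listed in this card — one per evaluated task — can be substituted for `harness_winogrande_5`.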
## Latest results
These are the latest results from run 2023-12-13T20:45:03.306502 (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B-Chat-DPO\n\n\n\nDataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B-Chat-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T20:45:03.306502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B-Chat-DPO\n\n\n\nDataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B-Chat-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T20:45:03.306502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
195,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of viethq188/LeoScorpius-7B-Chat-DPO\n\n\n\nDataset automatically created during the evaluation run of model viethq188/LeoScorpius-7B-Chat-DPO on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T20:45:03.306502(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]"
] |
ec0b37295c33d82c832ee8e56d547e3b997ed908 |
# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-Instruct-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0",
"harness_winogrande_5",
split="train")
```
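The aggregated metrics can be pulled from the "results" configuration in the same way. This is a sketch based on the configuration and split names listed in this card's metadata (the `latest` split always points at the most recent run):

```python
from datasets import load_dataset

# "results" holds the aggregated metrics of the run; "latest" points to the
# most recent evaluation (both names appear in this card's configuration list).
results = load_dataset(
    "open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0",
    "results",
    split="latest",
)
print(results[0])
```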
## Latest results
These are the [latest results from run 2023-12-13T21:02:33.929144](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0/blob/main/results_2023-12-13T21-02-33.929144.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6657586984797939,
"acc_stderr": 0.03165995758526614,
"acc_norm": 0.6666511531376961,
"acc_norm_stderr": 0.0323050384069596,
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107485,
"mc2": 0.7142943510205136,
"mc2_stderr": 0.015024530295000761
},
"harness|arc:challenge|25": {
"acc": 0.6808873720136519,
"acc_stderr": 0.013621696119173307,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.01325001257939344
},
"harness|hellaswag|10": {
"acc": 0.7070304720175263,
"acc_stderr": 0.004541944342035901,
"acc_norm": 0.8815972913762199,
"acc_norm_stderr": 0.003224240722351317
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361072,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361072
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6413793103448275,
"acc_stderr": 0.039966295748767186,
"acc_norm": 0.6413793103448275,
"acc_norm_stderr": 0.039966295748767186
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47883597883597884,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.47883597883597884,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.8,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8737373737373737,
"acc_stderr": 0.02366435940288023,
"acc_norm": 0.8737373737373737,
"acc_norm_stderr": 0.02366435940288023
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3814814814814815,
"acc_stderr": 0.029616718927497593,
"acc_norm": 0.3814814814814815,
"acc_norm_stderr": 0.029616718927497593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7184873949579832,
"acc_stderr": 0.02921354941437217,
"acc_norm": 0.7184873949579832,
"acc_norm_stderr": 0.02921354941437217
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.03388857118502325,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.03388857118502325
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8565400843881856,
"acc_stderr": 0.022818291821017012,
"acc_norm": 0.8565400843881856,
"acc_norm_stderr": 0.022818291821017012
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.02280138253459753,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.02280138253459753
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.014214138556913917,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.014214138556913917
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39329608938547483,
"acc_stderr": 0.016337268694270112,
"acc_norm": 0.39329608938547483,
"acc_norm_stderr": 0.016337268694270112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.729903536977492,
"acc_stderr": 0.02521804037341062,
"acc_norm": 0.729903536977492,
"acc_norm_stderr": 0.02521804037341062
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7901234567901234,
"acc_stderr": 0.02265834408598137,
"acc_norm": 0.7901234567901234,
"acc_norm_stderr": 0.02265834408598137
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7389705882352942,
"acc_stderr": 0.026679252270103135,
"acc_norm": 0.7389705882352942,
"acc_norm_stderr": 0.026679252270103135
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.018690850273595294,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.018690850273595294
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776334,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776334
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107485,
"mc2": 0.7142943510205136,
"mc2_stderr": 0.015024530295000761
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.01041084977522279
},
"harness|gsm8k|5": {
"acc": 0.6474601971190296,
"acc_stderr": 0.013159909755930337
}
}
```
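To dig into the individual predictions behind any of the scores above, the per-task detail configurations can be loaded as well. This is a sketch; the `harness_gsm8k_5` configuration and the `latest` split are taken from the configuration list in this card's metadata:

```python
from datasets import load_dataset

# Per-sample details for a single task; config names follow the
# "harness_<task>_<num_fewshot>" pattern listed in this card's configs.
gsm8k_details = load_dataset(
    "open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0",
    "harness_gsm8k_5",
    split="latest",
)
print(gsm8k_details[0].keys())
```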
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] | open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0 | [
"region:us"
] | 2023-12-13T21:05:25+00:00 | {"pretty_name": "Evaluation run of upstage/SOLAR-10.7B-Instruct-v1.0", "dataset_summary": "Dataset automatically created during the evaluation run of model [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\nTo load the details from a run, you can for instance do the following:\n```python\nfrom datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0\",\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\nThese are the [latest results from run 2023-12-13T21:02:33.929144](https://huggingface.co/datasets/open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0/blob/main/results_2023-12-13T21-02-33.929144.json)(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6657586984797939,\n \"acc_stderr\": 0.03165995758526614,\n \"acc_norm\": 0.6666511531376961,\n \"acc_norm_stderr\": 0.0323050384069596,\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7142943510205136,\n \"mc2_stderr\": 0.015024530295000761\n },\n \"harness|arc:challenge|25\": {\n \"acc\": 0.6808873720136519,\n \"acc_stderr\": 0.013621696119173307,\n \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.01325001257939344\n },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7070304720175263,\n \"acc_stderr\": 0.004541944342035901,\n \"acc_norm\": 0.8815972913762199,\n \"acc_norm_stderr\": 0.003224240722351317\n },\n \"harness|hendrycksTest-abstract_algebra|5\": {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\": {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361072,\n \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361072\n },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\": 0.44,\n 
\"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\": 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236785,\n \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236785\n },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\": 0.6413793103448275,\n \"acc_stderr\": 0.039966295748767186,\n \"acc_norm\": 0.6413793103448275,\n \"acc_norm_stderr\": 0.039966295748767186\n },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.47883597883597884,\n \"acc_stderr\": 0.025728230952130726,\n \"acc_norm\": 0.47883597883597884,\n \"acc_norm_stderr\": 0.025728230952130726\n },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \"acc\": 0.8,\n \"acc_stderr\": 0.031234752377721175,\n \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.031234752377721175\n },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.8737373737373737,\n \"acc_stderr\": 0.02366435940288023,\n \"acc_norm\": 0.8737373737373737,\n \"acc_norm_stderr\": 0.02366435940288023\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \"acc\": 
0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497593,\n \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497593\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"acc\": 0.7184873949579832,\n \"acc_stderr\": 0.02921354941437217,\n \"acc_norm\": 0.7184873949579832,\n \"acc_norm_stderr\": 0.02921354941437217\n },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\": 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.5555555555555556,\n \"acc_stderr\": 0.03388857118502325,\n \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.03388857118502325\n },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\": 0.8565400843881856,\n \"acc_stderr\": 0.022818291821017012,\n \"acc_norm\": 0.8565400843881856,\n \"acc_norm_stderr\": 0.022818291821017012\n },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\": {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\": 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\": {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\": {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n \"acc_stderr\": 0.02280138253459753,\n \"acc_norm\": 0.8589743589743589,\n \"acc_norm_stderr\": 0.02280138253459753\n },\n \"harness|hendrycksTest-medical_genetics|5\": {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n \"acc_stderr\": 0.014214138556913917,\n \"acc_norm\": 
0.8033205619412516,\n \"acc_norm_stderr\": 0.014214138556913917\n },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39329608938547483,\n \"acc_stderr\": 0.016337268694270112,\n \"acc_norm\": 0.39329608938547483,\n \"acc_norm_stderr\": 0.016337268694270112\n },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.729903536977492,\n \"acc_stderr\": 0.02521804037341062,\n \"acc_norm\": 0.729903536977492,\n \"acc_norm_stderr\": 0.02521804037341062\n },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.7901234567901234,\n \"acc_stderr\": 0.02265834408598137,\n \"acc_norm\": 0.7901234567901234,\n \"acc_norm_stderr\": 0.02265834408598137\n },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\": 0.7389705882352942,\n \"acc_stderr\": 0.026679252270103135,\n \"acc_norm\": 0.7389705882352942,\n \"acc_norm_stderr\": 0.026679252270103135\n },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.018690850273595294,\n \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.018690850273595294\n },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776334,\n \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.030151134457776334\n },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n \"mc1_stderr\": 0.017347024450107485,\n \"mc2\": 0.7142943510205136,\n \"mc2_stderr\": 0.015024530295000761\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.01041084977522279\n },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6474601971190296,\n \"acc_stderr\": 0.013159909755930337\n }\n}\n```", "repo_url": 
"https://huggingface.co/upstage/SOLAR-10.7B-Instruct-v1.0", "leaderboard_url": "https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard", "point_of_contact": "[email protected]", "configs": [{"config_name": "harness_arc_challenge_25", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|arc:challenge|25_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|arc:challenge|25_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_gsm8k_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|gsm8k|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|gsm8k|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hellaswag_10", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hellaswag|10_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hellaswag|10_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-02-33.929144.parquet", 
"**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-02-33.929144.parquet", 
"**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-management|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-02-33.929144.parquet", 
"**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-virology|5_2023-12-13T21-02-33.929144.parquet", "**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_abstract_algebra_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_anatomy_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-anatomy|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_astronomy_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-astronomy|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_business_ethics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-business_ethics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_clinical_knowledge_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_biology_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_biology|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_chemistry_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_chemistry|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_computer_science_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-02-33.929144.parquet"]}, 
{"split": "latest", "path": ["**/details_harness|hendrycksTest-college_computer_science|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_mathematics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_mathematics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_medicine_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_medicine|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_college_physics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-college_physics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_computer_security_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-computer_security|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_conceptual_physics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_econometrics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-econometrics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_electrical_engineering_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_elementary_mathematics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_formal_logic_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-formal_logic|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_global_facts_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-global_facts|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": 
"harness_hendrycksTest_high_school_biology_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_biology|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_chemistry_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_computer_science_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_european_history_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_geography_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_geography|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_government_and_politics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_macroeconomics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_mathematics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_microeconomics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_physics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-high_school_physics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_psychology_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_statistics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_us_history_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_high_school_world_history_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_aging_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_aging|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_human_sexuality_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-human_sexuality|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_international_law_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-international_law|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_jurisprudence_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-jurisprudence|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_logical_fallacies_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_machine_learning_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-machine_learning|5_2023-12-13T21-02-33.929144.parquet"]}]}, 
{"config_name": "harness_hendrycksTest_management_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-management|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_marketing_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-marketing|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_medical_genetics_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-medical_genetics|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_miscellaneous_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-miscellaneous|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_disputes_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_disputes|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_moral_scenarios_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_nutrition_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-nutrition|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_philosophy_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-philosophy|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_prehistory_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-prehistory|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_accounting_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_accounting|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_law_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": 
["**/details_harness|hendrycksTest-professional_law|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_medicine_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_medicine|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_professional_psychology_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-professional_psychology|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_public_relations_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-public_relations|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_security_studies_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-security_studies|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_sociology_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-sociology|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_us_foreign_policy_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_virology_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-virology|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_hendrycksTest_world_religions_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|hendrycksTest-world_religions|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_truthfulqa_mc_0", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|truthfulqa:mc|0_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "harness_winogrande_5", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["**/details_harness|winogrande|5_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": ["**/details_harness|winogrande|5_2023-12-13T21-02-33.929144.parquet"]}]}, {"config_name": "results", "data_files": [{"split": "2023_12_13T21_02_33.929144", "path": ["results_2023-12-13T21-02-33.929144.parquet"]}, {"split": "latest", "path": 
["results_2023-12-13T21-02-33.929144.parquet"]}]}]} | 2023-12-13T21:06:09+00:00 | [] | [] | TAGS
#region-us
|
# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-Instruct-v1.0
Dataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-Instruct-v1.0 on the Open LLM Leaderboard.
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).
To load the details from a run, you can for instance do the following:
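A minimal sketch of that call is given below. The repository name follows the leaderboard's usual `details_<org>__<model>` naming and is an assumption, not a verified value; the chosen configuration is just one of the 63 defined in the metadata above.

```python
from datasets import load_dataset

# Hypothetical repository name, following the leaderboard's "details_<org>__<model>" convention.
data = load_dataset(
    "open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0",
    "harness_winogrande_5",   # any configuration listed in the metadata above
    split="latest",           # or the timestamped split, e.g. "2023_12_13T21_02_33.929144"
)
print(data)
```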
## Latest results
These are the latest results from run 2023-12-13T21:02:33.929144 (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval).
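The aggregated metrics for the run live in the `results` configuration defined in the metadata above; a hedged sketch of pulling them (same assumed repository name as in the previous snippet):

```python
from datasets import load_dataset

# Assumed repository name; "results" holds the aggregated metrics of the run.
results = load_dataset(
    "open-llm-leaderboard/details_upstage__SOLAR-10.7B-Instruct-v1.0",
    "results",
    split="latest",
)
print(results[0])  # first (and only) row: the aggregated metrics
```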
## Dataset Details
### Dataset Description
- Curated by:
- Funded by [optional]:
- Shared by [optional]:
- Language(s) (NLP):
- License:
### Dataset Sources [optional]
- Repository:
- Paper [optional]:
- Demo [optional]:
## Uses
### Direct Use
### Out-of-Scope Use
## Dataset Structure
## Dataset Creation
### Curation Rationale
### Source Data
#### Data Collection and Processing
#### Who are the source data producers?
### Annotations [optional]
#### Annotation process
#### Who are the annotators?
#### Personal and Sensitive Information
## Bias, Risks, and Limitations
### Recommendations
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
[optional]
BibTeX:
APA:
## Glossary [optional]
## More Information [optional]
## Dataset Card Authors [optional]
## Dataset Card Contact
| [
"# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-Instruct-v1.0\n\n\n\nDataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-Instruct-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T21:02:33.929144(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
"TAGS\n#region-us \n",
"# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-Instruct-v1.0\n\n\n\nDataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-Instruct-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:",
"## Latest results\n\nThese are the latest results from run 2023-12-13T21:02:33.929144(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):",
"## Dataset Details",
"### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:",
"### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:",
"## Uses",
"### Direct Use",
"### Out-of-Scope Use",
"## Dataset Structure",
"## Dataset Creation",
"### Curation Rationale",
"### Source Data",
"#### Data Collection and Processing",
"#### Who are the source data producers?",
"### Annotations [optional]",
"#### Annotation process",
"#### Who are the annotators?",
"#### Personal and Sensitive Information",
"## Bias, Risks, and Limitations",
"### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:",
"## Glossary [optional]",
"## More Information [optional]",
"## Dataset Card Authors [optional]",
"## Dataset Card Contact"
] | [
6,
193,
67,
4,
40,
29,
3,
4,
9,
6,
5,
7,
4,
7,
10,
9,
5,
9,
8,
10,
46,
8,
7,
10,
5
] | [
"passage: TAGS\n#region-us \n# Dataset Card for Evaluation run of upstage/SOLAR-10.7B-Instruct-v1.0\n\n\n\nDataset automatically created during the evaluation run of model upstage/SOLAR-10.7B-Instruct-v1.0 on the Open LLM Leaderboard.\n\nThe dataset is composed of 63 configuration, each one coresponding to one of the evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The \"train\" split is always pointing to the latest results.\n\nAn additional configuration \"results\" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the Open LLM Leaderboard).\n\nTo load the details from a run, you can for instance do the following:## Latest results\n\nThese are the latest results from run 2023-12-13T21:02:33.929144(note that their might be results for other tasks in the repos if successive evals didn't cover the same tasks. You find each in the results and the \"latest\" split for each eval):## Dataset Details### Dataset Description\n\n\n\n\n\n- Curated by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Language(s) (NLP): \n- License:### Dataset Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Out-of-Scope Use## Dataset Structure## Dataset Creation### Curation Rationale### Source Data#### Data Collection and Processing#### Who are the source data producers?### Annotations [optional]#### Annotation process#### Who are the annotators?#### Personal and Sensitive Information## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Dataset Card Authors [optional]"
] |
ddbd344d6c16fdf5662a89002b596a60222aab0f | # Optimized Item Selection Datasets
We provide the datasets that are used to test the multi-level optimization framework ([CPAIOR'21](https://link.springer.com/chapter/10.1007/978-3-030-78230-6_27), [DSO@IJCAI'22](https://arxiv.org/abs/2112.03105)) for solving the Item Selection Problem (ISP) to boost exploration in Recommender Systems.
The multi-objective optimization framework is implemented in [Selective](https://github.com/fidelity/selective) as part of `TextBased Selection`. By solving the ISP with Text-based Selection in Selective, we select a smaller subset of items with maximum diversity in the latent embedding space of items and maximum coverage of labels.
The datasets are extracted and processed from their original public sources for research purposes as detailed below.
## Overview of Datasets
The datasets include:
* [**GoodReads datasets**](book_recommenders_data/) for book recommenders. Two datasets are randomly selected from the source data [GoodReads Book Reviews](https://dl.acm.org/doi/10.1145/3240323.3240369), a small version with 1000 items and a large version with 10,000 items. For book recommendations, there are 11 different genres (e.g., fiction, non-fiction, children), 231 different publishers (e.g. Vintage, Penguin Books, Mariner Books), and genre-publisher pairs. This leads to 574 and 1,322 unique book labels for the small and large datasets, respectively.
* [**MovieLens datasets**](movie_recommenders_data/) for movie recommenders. Two datasets are randomly selected from the source data [MovieLens Movie Ratings](https://dl.acm.org/doi/10.1145/2827872), a small version with 1000 items and a large version with 10,000 items. For movie recommendations, there are 19 different genres (e.g. action, comedy, drama, romance), 587 different producers, 34 different languages (e.g. English, French, Mandarin), and genre-language pairs. This leads to 473 and 1,011 unique movie labels for the small and large datasets, respectively.
Each dataset in GoodReads and MovieLens contains:
* `*_data.csv` that contains the text content (i.e., title + description) of the items, and
* `*_label.csv` that contains the labels (e.g., genre or language) and a binary 0/1 value denoting whether an item exhibits a label.
Each column in the csv file is for an item, indexed by book/movie ID. The order of columns in the data and label files is the same.
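To make that layout concrete, here is a small sketch of peeking at the GoodReads files with pandas. The paths and the `label` index column are taken from the Quick Start below; nothing beyond loading and printing is assumed.

```python
import pandas as pd

# Text content: one column per item (book ID).
data = pd.read_csv("book_recommenders_data/goodreads_1k_data.csv")

# Labels: a "label" column naming the genre/publisher, plus one 0/1 column per item.
labels = pd.read_csv("book_recommenders_data/goodreads_1k_label.csv").set_index("label")

print(data.shape, labels.shape)  # columns of both frames are the items, in the same order
print(labels.head())             # rows: labels; columns: item IDs with 0/1 values
```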
## Quick Start
To run the example, install the required packages with `pip install selective datasets`.
```python
# Import Selective (for text-based selection) and TextWiser (for embedding space)
import pandas as pd
from datasets import load_dataset
from textwiser import TextWiser, Embedding, Transformation
from feature.selector import Selective, SelectionMethod
# Load Text Contents
data = load_dataset('skadio/optimized_item_selection', data_files='book_recommenders_data/goodreads_1k_data.csv', split='train')
data = data.to_pandas()
# Load Labels
labels = load_dataset('skadio/optimized_item_selection', data_files='book_recommenders_data/goodreads_1k_label.csv', split='train')
labels = labels.to_pandas()
labels.set_index('label', inplace=True)
# TextWiser featurization method to create text embeddings
textwiser = TextWiser(Embedding.TfIdf(), Transformation.NMF(n_components=20, random_state=1234))
# Text-based selection with the default configuration
# The default configuration is optimization_method="exact" and cost_metric="diverse"
# By default, multi-level optimization maximizes coverage and diversity as described in (CPAIOR'21, DSO@IJCAI'22)
# within an upper bound on subset size given as num_features
selector = Selective(SelectionMethod.TextBased(num_features=30, featurization_method=textwiser))
# Result
subset = selector.fit_transform(data, labels)
print("Reduction:", list(subset.columns))
```
## Advanced Usages
Text-based Selection provides access to multiple selection methods.
At a high level, the configurations can be divided into exact, randomized, greedy, or cluster-based optimization.
### Exact
- (Default) Solve for Problem *P_max_cover@t* in **CPAIOR'21** - Selecting a subset of items that
maximizes coverage of labels and maximizes the diversity in latent embedding space within an upper
bound on subset size.
```python
selector = Selective(SelectionMethod.TextBased(num_features=30,
featurization_method=textwiser,
optimization_method='exact',
cost_metric='diverse'))
```
- Solve for Problem *P_unicost* in **CPAIOR'21** - Selecting a subset of items that covers all labels.
```python
selector = Selective(SelectionMethod.TextBased(num_features=None,
optimization_method='exact',
cost_metric='unicost'))
```
- Solve for Problem *P_diverse* in **CPAIOR'21** - Selecting a subset of items with maximized diversity
in the latent embedding space while still maintaining the coverage over all labels.
```python
selector = Selective(SelectionMethod.TextBased(num_features=None,
featurization_method=textwiser,
optimization_method='exact',
cost_metric='diverse'))
```
- Selecting a subset of items that only maximizes coverage within an upper bound on subset size.
```python
selector = Selective(SelectionMethod.TextBased(num_features=30,
optimization_method='exact',
cost_metric='unicost'))
```
### Randomized
- Selecting a subset by performing random selection. If num_features is not set, subset size is defined
by solving *P_unicost*.
```python
selector = Selective(SelectionMethod.TextBased(num_features=None, optimization_method='random'))
```
- Selecting a subset by performing random selection. Subset size is defined by num_features.
```python
selector = Selective(SelectionMethod.TextBased(num_features=30,
optimization_method='random'))
```
### Greedy
- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given
cost_metric, i.e. `diverse` by default or `unicost`. If num_features is not set, subset size is defined
by solving *P_unicost*.
```python
selector = Selective(SelectionMethod.TextBased(num_features=None,
optimization_method='greedy',
cost_metric='unicost'))
```
- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given
cost_metric, i.e. `diverse` by default or `unicost`.
```python
selector = Selective(SelectionMethod.TextBased(num_features=30,
optimization_method='greedy',
cost_metric='unicost'))
```
### Clustering
- Selecting a subset by clustering items into a number of clusters; the items closest to the centroids are selected. If num_features is not set, subset size is defined by solving *P_unicost*. The `cost_metric` argument is not used in this method.
```python
selector = Selective(SelectionMethod.TextBased(num_features=None, optimization_method='kmeans'))
```
- Selecting a subset by clustering items into a number of clusters; the items closest to the centroids are selected. The `cost_metric` argument is not used in this method.
```python
selector = Selective(SelectionMethod.TextBased(num_features=30,
optimization_method='kmeans'))
```
## Citation
If you use ISP in your research/applications, please cite as follows:
```bibtex
@inproceedings{cpaior2021,
title={Optimized Item Selection to Boost Exploration for Recommender Systems},
author={Serdar Kadıoğlu and Bernard Kleynhans and Xin Wang},
booktitle={Proceedings of Integration of Constraint Programming, Artificial Intelligence, and Operations Research: 18th International Conference, CPAIOR 2021, Vienna, Austria, July 5–8, 2021},
url={https://doi.org/10.1007/978-3-030-78230-6_27},
pages = {427–445},
year={2021}
}
```
```bibtex
@inproceedings{ijcai2022,
title={Active Learning Meets Optimized Item Selection},
author={Bernard Kleynhans and Xin Wang and Serdar Kadıoğlu},
    booktitle={The IJCAI-22 Workshop: Data Science meets Optimisation},
publisher={arXiv},
url={https://arxiv.org/abs/2112.03105},
year={2022}
}
``` | skadio/optimized_item_selection | [
"arxiv:2112.03105",
"region:us"
] | 2023-12-13T21:07:33+00:00 | {} | 2024-01-05T11:37:09+00:00 | [
"2112.03105"
] | [] | TAGS
#arxiv-2112.03105 #region-us
| # Optimized Item Selection Datasets
We provide the datasets that are used to test the multi-level optimization framework (CPAIOR'21, DSO@IJCAI'22), for solving Item Selection Problem (ISP) to boost exploration in Recommender Systems.
The the multi-objective optimization framework is implemented in Selective as part of 'TextBased Selection'. By solving the ISP with Text-based Selection in Selective, we select a smaller subset of items with maximum diversity in the latent embedding space of items and maximum coverage of labels.
The datasets are extracted and processed from their original public sources for research purposes as detailed below.
## Overview of Datasets
The datasets include:
* GoodReads datasets for book recommenders. Two datasets are randomly selected from the source data GoodReads Book Reviews, a small version with 1000 items and a large version with 10,000 items. For book recommendations, there are 11 different genres (e.g., fiction, non-fiction, children), 231 different publishers (e.g. Vintage, Penguin Books, Mariner Books), and genre-publisher pairs. This leads to 574 and 1,322 unique book labels for the small and large datasets, respectively.
* MovieLens datasets for movie recommenders. Two datasets are randomly selected from the source data MovieLens Movie Ratings, a small version with 1000 items and a large version with 10,000 items. For movie recommendations, there are 19 different genres (e.g. action, comedy, drama, romance), 587 different producers, 34 different languages (e.g. English, French, Mandarin), and genre-language pairs. This leads to 473 and 1,011 unique movie labels for the small and large datasets, respectively.
Each dataset in GoodReads and MovieLens contains:
* '*_data.csv' that contains the text content (i.e., title + description) of the items, and
* '*_label.csv' that contains the labels (e.g., genre or language) and a binary 0/1 value denoting whether an item exbihits a label.
Each column in the csv file is for an item, indexed by book/movie ID. The order of columns in data and label files are the same.
## Quick Start
To run the example, install required packages by 'pip install selective datasets'
## Advanced Usages
Text-based Selection provides access to multiple selection methods.
At a high-level, the configurations can be divided into exact, randomized, greedy or cluster-based optimization.
### Exact
- (Default) Solve for Problem *P_max_cover@t* in CPAIOR'21 - Selecting a subset of items that
maximizes coverage of labels and maximizes the diversity in latent embedding space within an upper
bound on subset size.
- Solve for Problem *P_unicost* in CPAIOR'21 - Selecting a subset of items that covers all labels.
- Solve for Problem *P_diverse* in CPAIOR'21 - Selecting a subset of items with maximized diversity
in the latent embedding space while still maintaining the coverage over all labels.
- Selecting a subset of items that only maximizes coverage within an upper bound on subset size.
### Randomized
- Selecting a subset by performing random selection. If num_features is not set, subset size is defined
by solving *P_unicost*.
- Selecting a subset by performing random selection. Subset size is defined by num_features.
### Greedy
- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given
cost_metric, i.e. 'diverse' by default or 'unicost'. If num_features is not set, subset size is defined
by solving *P_unicost*.
- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given
cost_metric, i.e. 'diverse' by default or 'unicost'.
### Clustering
- Selecting a subset by clustering items into a number of clusters and the items close to the centroids
are selected. If num_features is not set, subset size is defined by solving *P_unicost*. 'cost_metric' argument
is not used in this method.
- Selecting a subset by clustering items into a number of clusters and the items close to the centroids
are selected. 'cost_metric' argument is not used in this method.
If you use ISP in our research/applications, please cite as follows:
| [
"# Optimized Item Selection Datasets\n\nWe provide the datasets that are used to test the multi-level optimization framework (CPAIOR'21, DSO@IJCAI'22), for solving Item Selection Problem (ISP) to boost exploration in Recommender Systems. \n\nThe the multi-objective optimization framework is implemented in Selective as part of 'TextBased Selection'. By solving the ISP with Text-based Selection in Selective, we select a smaller subset of items with maximum diversity in the latent embedding space of items and maximum coverage of labels.\n\nThe datasets are extracted and processed from their original public sources for research purposes as detailed below.",
"## Overview of Datasets\nThe datasets include:\n\n* GoodReads datasets for book recommenders. Two datasets are randomly selected from the source data GoodReads Book Reviews, a small version with 1000 items and a large version with 10,000 items. For book recommendations, there are 11 different genres (e.g., fiction, non-fiction, children), 231 different publishers (e.g. Vintage, Penguin Books, Mariner Books), and genre-publisher pairs. This leads to 574 and 1,322 unique book labels for the small and large datasets, respectively.\n\n* MovieLens datasets for movie recommenders. Two datasets are randomly selected from the source data MovieLens Movie Ratings, a small version with 1000 items and a large version with 10,000 items. For movie recommendations, there are 19 different genres (e.g. action, comedy, drama, romance), 587 different producers, 34 different languages (e.g. English, French, Mandarin), and genre-language pairs. This leads to 473 and 1,011 unique movie labels for the small and large datasets, respectively.\n\nEach dataset in GoodReads and MovieLens contains:\n* '*_data.csv' that contains the text content (i.e., title + description) of the items, and\n* '*_label.csv' that contains the labels (e.g., genre or language) and a binary 0/1 value denoting whether an item exbihits a label. \n\nEach column in the csv file is for an item, indexed by book/movie ID. The order of columns in data and label files are the same.",
"## Quick Start\nTo run the example, install required packages by 'pip install selective datasets'",
"## Advanced Usages\nText-based Selection provides access to multiple selection methods. \n\nAt a high-level, the configurations can be divided into exact, randomized, greedy or cluster-based optimization.",
"### Exact\n\n- (Default) Solve for Problem *P_max_cover@t* in CPAIOR'21 - Selecting a subset of items that \nmaximizes coverage of labels and maximizes the diversity in latent embedding space within an upper \nbound on subset size.\n\n- Solve for Problem *P_unicost* in CPAIOR'21 - Selecting a subset of items that covers all labels.\n\n- Solve for Problem *P_diverse* in CPAIOR'21 - Selecting a subset of items with maximized diversity \nin the latent embedding space while still maintaining the coverage over all labels.\n\n- Selecting a subset of items that only maximizes coverage within an upper bound on subset size.",
"### Randomized\n\n- Selecting a subset by performing random selection. If num_features is not set, subset size is defined \nby solving *P_unicost*.\n\n- Selecting a subset by performing random selection. Subset size is defined by num_features.",
"### Greedy\n\n- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given\ncost_metric, i.e. 'diverse' by default or 'unicost'. If num_features is not set, subset size is defined \nby solving *P_unicost*.\n\n- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given\ncost_metric, i.e. 'diverse' by default or 'unicost'.",
"### Clustering\n\n- Selecting a subset by clustering items into a number of clusters and the items close to the centroids \nare selected. If num_features is not set, subset size is defined by solving *P_unicost*. 'cost_metric' argument \nis not used in this method.\n\n- Selecting a subset by clustering items into a number of clusters and the items close to the centroids \nare selected. 'cost_metric' argument is not used in this method.\n\n\nIf you use ISP in our research/applications, please cite as follows:"
] | [
"TAGS\n#arxiv-2112.03105 #region-us \n",
"# Optimized Item Selection Datasets\n\nWe provide the datasets that are used to test the multi-level optimization framework (CPAIOR'21, DSO@IJCAI'22), for solving Item Selection Problem (ISP) to boost exploration in Recommender Systems. \n\nThe the multi-objective optimization framework is implemented in Selective as part of 'TextBased Selection'. By solving the ISP with Text-based Selection in Selective, we select a smaller subset of items with maximum diversity in the latent embedding space of items and maximum coverage of labels.\n\nThe datasets are extracted and processed from their original public sources for research purposes as detailed below.",
"## Overview of Datasets\nThe datasets include:\n\n* GoodReads datasets for book recommenders. Two datasets are randomly selected from the source data GoodReads Book Reviews, a small version with 1000 items and a large version with 10,000 items. For book recommendations, there are 11 different genres (e.g., fiction, non-fiction, children), 231 different publishers (e.g. Vintage, Penguin Books, Mariner Books), and genre-publisher pairs. This leads to 574 and 1,322 unique book labels for the small and large datasets, respectively.\n\n* MovieLens datasets for movie recommenders. Two datasets are randomly selected from the source data MovieLens Movie Ratings, a small version with 1000 items and a large version with 10,000 items. For movie recommendations, there are 19 different genres (e.g. action, comedy, drama, romance), 587 different producers, 34 different languages (e.g. English, French, Mandarin), and genre-language pairs. This leads to 473 and 1,011 unique movie labels for the small and large datasets, respectively.\n\nEach dataset in GoodReads and MovieLens contains:\n* '*_data.csv' that contains the text content (i.e., title + description) of the items, and\n* '*_label.csv' that contains the labels (e.g., genre or language) and a binary 0/1 value denoting whether an item exbihits a label. \n\nEach column in the csv file is for an item, indexed by book/movie ID. The order of columns in data and label files are the same.",
"## Quick Start\nTo run the example, install required packages by 'pip install selective datasets'",
"## Advanced Usages\nText-based Selection provides access to multiple selection methods. \n\nAt a high-level, the configurations can be divided into exact, randomized, greedy or cluster-based optimization.",
"### Exact\n\n- (Default) Solve for Problem *P_max_cover@t* in CPAIOR'21 - Selecting a subset of items that \nmaximizes coverage of labels and maximizes the diversity in latent embedding space within an upper \nbound on subset size.\n\n- Solve for Problem *P_unicost* in CPAIOR'21 - Selecting a subset of items that covers all labels.\n\n- Solve for Problem *P_diverse* in CPAIOR'21 - Selecting a subset of items with maximized diversity \nin the latent embedding space while still maintaining the coverage over all labels.\n\n- Selecting a subset of items that only maximizes coverage within an upper bound on subset size.",
"### Randomized\n\n- Selecting a subset by performing random selection. If num_features is not set, subset size is defined \nby solving *P_unicost*.\n\n- Selecting a subset by performing random selection. Subset size is defined by num_features.",
"### Greedy\n\n- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given\ncost_metric, i.e. 'diverse' by default or 'unicost'. If num_features is not set, subset size is defined \nby solving *P_unicost*.\n\n- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given\ncost_metric, i.e. 'diverse' by default or 'unicost'.",
"### Clustering\n\n- Selecting a subset by clustering items into a number of clusters and the items close to the centroids \nare selected. If num_features is not set, subset size is defined by solving *P_unicost*. 'cost_metric' argument \nis not used in this method.\n\n- Selecting a subset by clustering items into a number of clusters and the items close to the centroids \nare selected. 'cost_metric' argument is not used in this method.\n\n\nIf you use ISP in our research/applications, please cite as follows:"
] | [
15,
158,
382,
23,
45,
175,
66,
120,
130
] | [
"passage: TAGS\n#arxiv-2112.03105 #region-us \n# Optimized Item Selection Datasets\n\nWe provide the datasets that are used to test the multi-level optimization framework (CPAIOR'21, DSO@IJCAI'22), for solving Item Selection Problem (ISP) to boost exploration in Recommender Systems. \n\nThe the multi-objective optimization framework is implemented in Selective as part of 'TextBased Selection'. By solving the ISP with Text-based Selection in Selective, we select a smaller subset of items with maximum diversity in the latent embedding space of items and maximum coverage of labels.\n\nThe datasets are extracted and processed from their original public sources for research purposes as detailed below.",
"passage: ## Overview of Datasets\nThe datasets include:\n\n* GoodReads datasets for book recommenders. Two datasets are randomly selected from the source data GoodReads Book Reviews, a small version with 1000 items and a large version with 10,000 items. For book recommendations, there are 11 different genres (e.g., fiction, non-fiction, children), 231 different publishers (e.g. Vintage, Penguin Books, Mariner Books), and genre-publisher pairs. This leads to 574 and 1,322 unique book labels for the small and large datasets, respectively.\n\n* MovieLens datasets for movie recommenders. Two datasets are randomly selected from the source data MovieLens Movie Ratings, a small version with 1000 items and a large version with 10,000 items. For movie recommendations, there are 19 different genres (e.g. action, comedy, drama, romance), 587 different producers, 34 different languages (e.g. English, French, Mandarin), and genre-language pairs. This leads to 473 and 1,011 unique movie labels for the small and large datasets, respectively.\n\nEach dataset in GoodReads and MovieLens contains:\n* '*_data.csv' that contains the text content (i.e., title + description) of the items, and\n* '*_label.csv' that contains the labels (e.g., genre or language) and a binary 0/1 value denoting whether an item exbihits a label. \n\nEach column in the csv file is for an item, indexed by book/movie ID. The order of columns in data and label files are the same.## Quick Start\nTo run the example, install required packages by 'pip install selective datasets'## Advanced Usages\nText-based Selection provides access to multiple selection methods. \n\nAt a high-level, the configurations can be divided into exact, randomized, greedy or cluster-based optimization.### Exact\n\n- (Default) Solve for Problem *P_max_cover@t* in CPAIOR'21 - Selecting a subset of items that \nmaximizes coverage of labels and maximizes the diversity in latent embedding space within an upper \nbound on subset size.\n\n- Solve for Problem *P_unicost* in CPAIOR'21 - Selecting a subset of items that covers all labels.\n\n- Solve for Problem *P_diverse* in CPAIOR'21 - Selecting a subset of items with maximized diversity \nin the latent embedding space while still maintaining the coverage over all labels.\n\n- Selecting a subset of items that only maximizes coverage within an upper bound on subset size.### Randomized\n\n- Selecting a subset by performing random selection. If num_features is not set, subset size is defined \nby solving *P_unicost*.\n\n- Selecting a subset by performing random selection. Subset size is defined by num_features.### Greedy\n\n- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given\ncost_metric, i.e. 'diverse' by default or 'unicost'. If num_features is not set, subset size is defined \nby solving *P_unicost*.\n\n- Selecting a subset by adding an item each time using a greedy heuristic in selection with a given\ncost_metric, i.e. 'diverse' by default or 'unicost'."
] |